
superlake

A modern, intuitive Python package for data lakehouse operations

Main SuperLake Classes

  • SuperSpark: Instantiates a SparkSession with Delta Lake support.
  • SuperDeltaTable: Manages Delta tables (create, read, write, optimize, vacuum, SCD2, schema evolution, etc.).
  • SuperPipeline: Orchestrates data pipelines from source to bronze and silver layers, including CDC and transformation logic.
  • SuperGoldPipeline: Manages gold-layer aggregations and writes results to gold tables.
  • SuperDataframe: Utility class for DataFrame cleaning, casting, and manipulation.
  • SuperLogger: Logging and metrics for pipeline operations.
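SuperDeltaTable's SCD2 support means history is kept by closing the current version of a row and appending the new one. In superlake this runs as a Delta merge; the mechanics can be sketched in plain Python (the helper `scd2_upsert` and the `start_dt`/`end_dt` column names are illustrative, not superlake's API):

```python
from datetime import datetime

def scd2_upsert(history, incoming, key, now=None):
    """Slowly Changing Dimension type 2: close the current record for each
    changed key and append the new version, preserving full history.
    Note: mutates the dicts in `history` when closing a record."""
    now = now or datetime.now()
    result = list(history)
    # index of currently-open records (end_dt is None) by business key
    current = {r[key]: r for r in result if r["end_dt"] is None}
    for row in incoming:
        old = current.get(row[key])
        if old is not None:
            if not any(old.get(k) != v for k, v in row.items()):
                continue                 # identical row: nothing to do
            old["end_dt"] = now          # close the previous version
        result.append({**row, "start_dt": now, "end_dt": None})
    return result

history = [{"customer_id": "c1", "country": "FR", "start_dt": None, "end_dt": None}]
updated = scd2_upsert(history, [{"customer_id": "c1", "country": "DE"}], "customer_id")
```

After the upsert, the old `FR` row is closed (its `end_dt` is set) and a new open `DE` row is appended, so a point-in-time query can reconstruct either state.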

Quick Example Usage

from superlake.core import (
    SuperSpark, SuperDeltaTable, TableSaveMode,
    SchemaEvolution, SuperPipeline, SuperGoldPipeline
)
from superlake.monitoring import SuperLogger
import pyspark.sql.types as T
from datetime import datetime

# Initialize Spark and logger
spark = SuperSpark()
logger = SuperLogger()
superlake_dt = datetime.now()

# Define a Delta table
bronze_customer = SuperDeltaTable(
    table_name="01_bronze.customer",
    table_path="/path/to/bronze/customer",
    table_schema=T.StructType([
        T.StructField("customer_id", T.StringType(), False),
        T.StructField("name", T.StringType(), True),
        T.StructField("email", T.StringType(), True),
        T.StructField("country", T.StringType(), True),
        T.StructField("signup_date", T.DateType(), True),
        T.StructField("superlake_dt", T.TimestampType(), True)
    ]),
    table_save_mode=TableSaveMode.Append,
    primary_keys=["customer_id"],
    partition_cols=["superlake_dt"],
    schema_evolution_option=SchemaEvolution.Merge,
    logger=logger,
    managed=True
)

# Define CDC and transformation functions
def customer_cdc(spark):
    # Return a DataFrame with new/changed customer data
    ...

def customer_tra(df):
    # Clean and transform customer data
    return df

# Create and run a pipeline
customer_pipeline = SuperPipeline(
    superlake_dt=superlake_dt,
    bronze_table=bronze_customer,
    silver_table=...,  # another SuperDeltaTable
    cdc_function=customer_cdc,
    tra_function=customer_tra,
    logger=logger,
    spark=spark,
    environment="test"
)
customer_pipeline.execute()
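The `cdc_function` only needs to return rows that are new or changed since the last run. One common strategy, sketched here in plain Python (the `changed_rows` helper and its key-to-hash snapshot are hypothetical, not part of superlake), is to compare a content hash per business key:

```python
import hashlib
import json

def changed_rows(snapshot, rows, key):
    """Return the rows whose content hash differs from the previous
    snapshot, plus the updated snapshot (a simple CDC strategy)."""
    out, new_snapshot = [], dict(snapshot)
    for row in rows:
        digest = hashlib.sha256(
            json.dumps(row, sort_keys=True, default=str).encode()
        ).hexdigest()
        if snapshot.get(row[key]) != digest:
            out.append(row)
        new_snapshot[row[key]] = digest
    return out, new_snapshot

rows = [{"customer_id": "c1", "country": "FR"},
        {"customer_id": "c2", "country": "DE"}]
delta, snap = changed_rows({}, rows, "customer_id")    # first run: all rows are new
delta2, _ = changed_rows(snap, rows, "customer_id")    # second run: nothing changed
```

In a real pipeline the snapshot would be persisted between runs (for example in a Delta table keyed by `superlake_dt`), so each execution only feeds genuinely new or changed rows into the bronze layer.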

See example/superlake_example.py for a full pipeline example, including gold table aggregation.
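The gold step referenced above is essentially a group-by over the silver table whose result is written to a gold table. A minimal plain-Python analogue of that aggregation (hypothetical column names, not the actual example code):

```python
from collections import Counter

def customers_per_country(silver_rows):
    """Gold-layer style aggregation: count customers by country."""
    return dict(Counter(r["country"] for r in silver_rows))

counts = customers_per_country([
    {"customer_id": "c1", "country": "FR"},
    {"customer_id": "c2", "country": "FR"},
    {"customer_id": "c3", "country": "DE"},
])
```

In superlake this computation would be expressed as a Spark aggregation inside a SuperGoldPipeline, which then handles writing the result to the gold Delta table.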

Download files

  • Source Distribution: superlake-0.1.0.tar.gz (16.7 kB)
  • Built Distribution: superlake-0.1.0-py3-none-any.whl (16.6 kB)
File details

superlake-0.1.0.tar.gz

  • Size: 16.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.9
  • SHA256: 212a0442dc56b02f62310b21128de96522f1cf87c3e1562df2c7a54dcb8eb025
  • MD5: 12d9d42b9ab1fa4f77f99708d32a506e
  • BLAKE2b-256: 2a5c59062ec5193aa25a1d5ab542d4b8ef7093ea089f288206e0953da592863b

superlake-0.1.0-py3-none-any.whl

  • Size: 16.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.9
  • SHA256: b194e04ba56054e05eca764d8c2be0eb75206ce4e2126744940490c7ff8ad570
  • MD5: 06a83e7e11a1aa257ddb023da6b668c7
  • BLAKE2b-256: fb5fea8849abb1afad524c4ecb02c34a5f018dbc7fce7c821bdc0c877f321da2
