
Project description

superlake

A modern, intuitive Python package for data lakehouse operations

Main SuperLake Classes

  • SuperSpark: Instantiates a SparkSession with Delta Lake support.
  • SuperDeltaTable: Manages Delta tables (create, read, write, optimize, vacuum, SCD2, schema evolution, etc.).
  • SuperPipeline: Orchestrates data pipelines from source to bronze and silver layers, including CDC and transformation logic.
  • SuperGoldPipeline: Manages gold-layer aggregations and writes results to gold tables.
  • SuperDataframe: Utility class for DataFrame cleaning, casting, and manipulation.
  • SuperLogger: Logging and metrics for pipeline operations.
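SuperDeltaTable handles SCD2 (slowly changing dimensions, type 2) internally. As a rough illustration of the underlying idea — close out the open record when a key's attributes change, and append the new version — here is a minimal pure-Python sketch; this is not the SuperLake API, and the function name is ours:

```python
from datetime import datetime

def scd2_merge(history, incoming, key="customer_id", now=None):
    """Toy SCD2 merge: close the open record for a changed key and
    append the new version; brand-new keys are simply appended."""
    now = now or datetime.now()
    open_rows = {r[key]: r for r in history if r["valid_to"] is None}
    for row in incoming:
        current = open_rows.get(row[key])
        changed = current is None or any(current[c] != row[c] for c in row)
        if current is not None and changed:
            current["valid_to"] = now  # close the previous version
        if changed:
            history.append({**row, "valid_from": now, "valid_to": None})
    return history
```

The real implementation operates on Delta tables via Spark; this sketch only shows the versioning logic that SCD2 implies.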

Quick Example Usage

from superlake.core import SuperSpark, SuperDeltaTable, TableSaveMode, SchemaEvolution, SuperPipeline, SuperGoldPipeline
from superlake.monitoring import SuperLogger
import pyspark.sql.types as T
from datetime import datetime

# Initialize Spark and logger
spark = SuperSpark()
logger = SuperLogger()
superlake_dt = datetime.now()

# Define a Delta table
bronze_customer = SuperDeltaTable(
    table_name="01_bronze.customer",
    table_path="/path/to/bronze/customer",
    table_schema=T.StructType([
        T.StructField("customer_id", T.StringType(), False),
        T.StructField("name", T.StringType(), True),
        T.StructField("email", T.StringType(), True),
        T.StructField("country", T.StringType(), True),
        T.StructField("signup_date", T.DateType(), True),
        T.StructField("superlake_dt", T.TimestampType(), True)
    ]),
    table_save_mode=TableSaveMode.Append,
    primary_keys=["customer_id"],
    partition_cols=["superlake_dt"],
    schema_evolution_option=SchemaEvolution.Merge,
    logger=logger,
    managed=True
)

# Define CDC and transformation functions
def customer_cdc(spark):
    # Return a DataFrame with new/changed customer data
    ...

def customer_tra(df):
    # Clean and transform customer data
    return df
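The actual cdc_function above must return a Spark DataFrame; conceptually, its job is to emit only rows that are new or changed since the last run. A minimal pure-Python illustration of that idea (not the SuperLake API — the helper name and snapshot shape are ours):

```python
def changed_rows(previous, current, key="customer_id"):
    """Return rows from `current` that are new or that differ from
    the prior snapshot in `previous`, keyed by the primary key."""
    seen = {row[key]: row for row in previous}
    return [row for row in current
            if row[key] not in seen or seen[row[key]] != row]
```

In a real pipeline this comparison would typically be pushed down to the source system or derived from Delta change data, not computed over full in-memory snapshots.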

# Create and run a pipeline
customer_pipeline = SuperPipeline(
    superlake_dt=superlake_dt,
    bronze_table=bronze_customer,
    silver_table=...,  # another SuperDeltaTable
    cdc_function=customer_cdc,
    tra_function=customer_tra,
    logger=logger,
    spark=spark,
    environment="test"
)
customer_pipeline.execute()

See example/superlake_example.py for a full pipeline example, including gold table aggregation.
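SuperGoldPipeline writes aggregated results to gold tables. As an illustration of a typical gold-layer aggregation (customers per country), here is a pure-Python sketch of the shape of such a computation; the Spark version would use a groupBy over the silver table, and the function name here is ours:

```python
from collections import Counter

def customers_per_country(rows):
    """Gold-style aggregation: count customers by country."""
    return dict(Counter(row["country"] for row in rows))
```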

Download files

Download the file for your platform.

Source Distribution

superlake-0.1.1.tar.gz (16.8 kB)


Built Distribution


superlake-0.1.1-py3-none-any.whl (17.1 kB)


File details

Details for the file superlake-0.1.1.tar.gz.

File metadata

  • Download URL: superlake-0.1.1.tar.gz
  • Upload date:
  • Size: 16.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for superlake-0.1.1.tar.gz
  • SHA256: 1cf99f11fe1a953c819063c956ffd6b8d4cb72407de2822b98d25554d997c1b8
  • MD5: 66ce5639e29fbfaad9609f524b967ecc
  • BLAKE2b-256: 5f7116fef1b8a8636c19bbcb200e9d58332a22136a7cdd3ac29cfdbfb6ffcbae


File details

Details for the file superlake-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: superlake-0.1.1-py3-none-any.whl
  • Upload date:
  • Size: 17.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for superlake-0.1.1-py3-none-any.whl
  • SHA256: 5108ec854f43daf18f5a8b92b7cec84c85acde9e88530d5b723e3350213584bc
  • MD5: 29fd59b35162fad765fb541f26b50105
  • BLAKE2b-256: fe3908fb662c9b51729293871d252733fea431bad11809dca6281a14fd386a5e
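The digests above can be checked locally after download; a minimal sketch using Python's standard hashlib (the helper name is ours):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file in chunks and return its SHA-256 hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare against the digest listed above, e.g.:
# sha256_of("superlake-0.1.1-py3-none-any.whl") \
#     == "5108ec854f43daf18f5a8b92b7cec84c85acde9e88530d5b723e3350213584bc"
```

pip can also enforce this automatically via its hash-checking mode (`--require-hashes` with pinned hashes in a requirements file).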

