Project description

LakeScum

A Python package to help Databricks Unity Catalog users read and query Delta Lake tables with Polars, DuckDB, or PyArrow.

Unity Catalog does not play nicely out of the box with many of these tools; built-in features like `polars.read_delta()`, for example, often fail against Unity Catalog tables.

LakeScum takes that difficulty away.

Installation

LakeScum can be installed for Python with a simple pip command:

```
pip install LakeScum
```

Usage

There are currently methods to read and query a Unity Catalog Delta Lake table with ...

  • Polars
  • DuckDb
  • PyArrow

Polars

You can query a Unity Catalog Delta Lake table and return a Polars DataFrame with the `unity_catalog_delta_to_polars()` method.

It takes two required parameters and one optional:

  • `spark`: SparkSession - an active Spark session
  • `table_name`: str - Unity Catalog table name
  • `sql_filter`: str - optional SQL `WHERE` clause filter

Example ...

```python
polars_df = unity_catalog_delta_to_polars(
    spark,
    "production.default.fact_orders",
    sql_filter="year = 2024 and month = 3 and day = 10",
)
print(polars_df.head(10))
```

```
order_id | product_id | order_date | quantity
1 | 4567 | '2024-03-10' | 5
```
DuckDb

The `unity_catalog_delta_register_to_duckdb()` method registers a Unity Catalog Delta table as a DuckDB table so you can query it with DuckDB.

It takes three required parameters and one optional:

  • `spark`: SparkSession - an active Spark session
  • `unity_table_name`: str - Unity Catalog table name
  • `duck_table_name`: str - desired DuckDB table name
  • `sql_filter`: str - optional SQL `WHERE` clause filter

Example ...

```python
unity_catalog_delta_register_to_duckdb(
    spark,
    "production.default.fact_orders",
    "test",
    sql_filter="year = 2024 and month = 3 and day = 19",
)
results = duckdb.sql("SELECT * FROM test")
print(results)
```

```
order_id | product_id | order_date | quantity
1 | 4567 | '2024-03-19' | 5
```
PyArrow

The `unity_catalog_delta_to_pyarrow()` method returns a PyArrow Table from a Unity Catalog Delta table.

It takes two required parameters and one optional:

  • `spark`: SparkSession - an active Spark session
  • `table_name`: str - Unity Catalog table name
  • `sql_filter`: str - optional SQL `WHERE` clause filter

Example ...

```python
pa = unity_catalog_delta_to_pyarrow(
    spark,
    "production.default.fact_orders",
    sql_filter="year = 2024 and month = 3 and day = 19",
)
print(pa)
```

```
order_id | product_id | order_date | quantity
1 | 4567 | '2024-03-19' | 5
```
