DuckLake provider for Apache Airflow (based on DuckDB)

Project description

DuckLake Provider for Apache Airflow

This is a custom provider for integrating DuckLake (based on DuckDB) with Apache Airflow.

DuckLake Configuration

The DuckLakeHook uses Airflow connection fields and extras to configure the connection. The standard fields are repurposed as follows:

  • Host: The metadata host (e.g., Postgres/MySQL host) or the metadata file path (for DuckDB/SQLite).
  • Login: Username (for Postgres/MySQL).
  • Password: Password (for Postgres/MySQL).
  • Schema: Metadata schema (defaults to 'duckdb').
  • Extra: JSON dict for all other settings (required for engine, storage_type, and conditional fields).

Example extras JSON (adjust based on engine and storage_type; the "#" annotations are explanatory comments, not valid JSON):

    {
      "engine": "postgres",
      "dbname": "my_ducklake",
      "pgdbname": "dev_nophiml_db",
      "storage_type": "s3",
      "s3_bucket": "your-s3-bucket",
      "s3_path": "your/s3/path/",
      "aws_access_key_id": "your-access-key-id",
      "aws_secret_access_key": "your-secret-access-key",
      "aws_region": "us-east-1",
      "install_extensions": ["spatial"],  # Optional: inherited from the DuckDB provider
      "load_extensions": ["spatial"],     # Optional
      "connect_stack": [                  # Optional: override the default DuckLake install/load commands
        "INSTALL httpfs;",
        "LOAD httpfs;",
        "INSTALL ducklake;",
        "LOAD ducklake;"
      ]
    }
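
These fields map onto an Airflow Connection in the usual way. A minimal provisioning sketch (the connection id "ducklake_default" and conn type "ducklake" are illustrative assumptions; creating the connection through the UI works just as well):

    import json

    from airflow.models.connection import Connection

    conn = Connection(
        conn_id="ducklake_default",   # illustrative id
        conn_type="ducklake",         # assumed conn type registered by this provider
        host="my-postgres-host",      # metadata host (postgres engine)
        login="airflow",
        password="***",
        schema="duckdb",              # metadata schema
        extra=json.dumps({
            "engine": "postgres",
            "pgdbname": "dev_nophiml_db",
            "storage_type": "s3",
            "s3_bucket": "your-s3-bucket",
            "s3_path": "your/s3/path/",
        }),
    )

    # The URI form can be dropped into an AIRFLOW_CONN_* environment variable.
    print(f"AIRFLOW_CONN_{conn.conn_id.upper()}={conn.get_uri()}")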

Supported Engines (set in extras['engine'])

  • duckdb: Requires 'metadata_file' in extras, or the Host field set to the metadata file path (see the sketch after this list).
  • sqlite: Requires 'metadata_file' in extras, or the Host field set to the metadata file path.
  • postgres: Requires host, login, password, and 'pgdbname' in extras.
  • mysql: Requires host, login, password, and 'mysqldbname' in extras.
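
For a file-based catalog, a minimal extras sketch might look like this (paths are illustrative; local storage is shown so the example is self-contained):

    {
      "engine": "duckdb",
      "metadata_file": "/data/ducklake/metadata.ducklake",
      "storage_type": "local",
      "local_data_path": "/data/ducklake/files/"
    }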

Supported Storage Types (set in extras['storage_type'], default 's3')

  • s3: Requires 's3_bucket', 's3_path'; optional AWS creds.
  • azure: Requires 'azure_account_name', 'azure_container', 'azure_path'; optional connection_string.
  • gcs: Requires 'gcs_bucket', 'gcs_path'; optional 'service_account_key' (JSON string); see the sketch after this list.
  • local: Requires 'local_data_path'.
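
For example, a GCS-backed layout pairs an engine block with the 'gcs' keys (values are illustrative; per the list above, the service account key is passed as a JSON string):

    {
      "engine": "sqlite",
      "metadata_file": "/data/ducklake/metadata.sqlite",
      "storage_type": "gcs",
      "gcs_bucket": "your-gcs-bucket",
      "gcs_path": "your/gcs/path/",
      "service_account_key": "{\"type\": \"service_account\", ...}"
    }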

The connection form in the UI shows only the core fields; everything engine- or storage-specific belongs in extras. To change behavior, set 'engine' and 'storage_type' in extras and supply the corresponding keys listed above. If you need to customize the static DuckLake connection commands (for example, to install additional extensions), provide a 'connect_stack' list in extras; commands that depend on runtime values (secrets, thread settings, attachments, etc.) are always appended automatically by the hook.
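
Putting it together, a task might use the hook as in this sketch (the import path, the 'ducklake_conn_id' parameter name, and get_conn() returning a DuckDB connection are assumptions based on common Airflow provider layouts and the DuckDB provider this package builds on; check the package source for the exact names):

    from airflow.decorators import task

    @task
    def load_events():
        # Assumed import path; adjust to the installed package layout.
        from ducklake_provider.hooks.ducklake import DuckLakeHook

        # "ducklake_default" is an illustrative connection id, and the
        # parameter name is an assumption modeled on other providers.
        hook = DuckLakeHook(ducklake_conn_id="ducklake_default")

        # Assumed to return a live DuckDB connection with DuckLake attached,
        # mirroring the DuckDB provider's get_conn() behavior.
        conn = hook.get_conn()
        conn.execute("CREATE TABLE IF NOT EXISTS events (id INTEGER, payload VARCHAR);")
        conn.execute("INSERT INTO events VALUES (1, 'hello ducklake');")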

Download files

Source Distribution

airflow_provider_ducklake-0.0.6.tar.gz (9.4 kB)

Built Distribution

airflow_provider_ducklake-0.0.6-py3-none-any.whl (10.4 kB)

File details

Details for the file airflow_provider_ducklake-0.0.6.tar.gz.

File hashes

Hashes for airflow_provider_ducklake-0.0.6.tar.gz:

    SHA256       06d23c07364229a487d9d479af1ecad5d5bceaa87d9b1df5f59618b9042c6f08
    MD5          2d0fed8d88eff03a5c7c2c6b489611af
    BLAKE2b-256  7150ef5a2880ad6078ffeb89c314c1049cbed6c328de11357f8259a8eb98f0c8

File details

Details for the file airflow_provider_ducklake-0.0.6-py3-none-any.whl.

File hashes

Hashes for airflow_provider_ducklake-0.0.6-py3-none-any.whl:

    SHA256       4052156164c19c8f68037bc373a2e270dc839f14912de8a7c7bac51c72b0ddce
    MD5          532eb63e2129713644b7372e14074769
    BLAKE2b-256  34b2708d6a078d39341b1a152d6a8b51bd1919e1d3fbe23c65d3dda618c7a908
