
Project description

DuckLake Provider for Apache Airflow

This is a custom provider for integrating DuckLake (based on DuckDB) with Apache Airflow.

DuckLake Configuration

The DuckLakeHook uses standard Airflow connection fields plus the Extra field to configure the connection. The standard fields are repurposed as follows:

  • Host: Metadata host (e.g., the Postgres/MySQL host) or a file path (e.g., the DuckDB/SQLite metadata file).
  • Login: Username (for Postgres/MySQL).
  • Password: Password (for Postgres/MySQL).
  • Schema: Metadata schema (defaults to 'duckdb').
  • Extra: JSON dict for all other settings (required for engine, storage_type, and the engine/storage-specific keys described below).

Example extras JSON (adjust based on engine and storage_type):

{
  "engine": "postgres",
  "dbname": "my_ducklake",
  "pgdbname": "dev_nophiml_db",
  "storage_type": "s3",
  "s3_bucket": "your-s3-bucket",
  "s3_path": "your/s3/path/",
  "aws_access_key_id": "your-access-key-id",
  "aws_secret_access_key": "your-secret-access-key",
  "aws_region": "us-east-1",
  "install_extensions": ["spatial"],
  "load_extensions": ["spatial"],
  "connect_stack": [
    "INSTALL httpfs;",
    "LOAD httpfs;",
    "INSTALL ducklake;",
    "LOAD ducklake;"
  ]
}

install_extensions and load_extensions are optional (inherited from the DuckDB provider). connect_stack is also optional; it overrides the default DuckLake install/load commands.
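As a sketch, the same connection can also be created programmatically. The conn_id and conn_type below are assumptions (check the connection type this provider actually registers); the remaining values mirror the example above:

import json

from airflow import settings
from airflow.models import Connection

# Hypothetical conn_id and conn_type -- adjust to your deployment.
conn = Connection(
    conn_id="ducklake_default",
    conn_type="ducklake",
    host="pg.example.internal",  # metadata host for the postgres engine (hypothetical)
    login="airflow",
    password="change-me",
    schema="duckdb",             # metadata schema; the provider's default
    extra=json.dumps(
        {
            "engine": "postgres",
            "pgdbname": "dev_nophiml_db",
            "storage_type": "s3",
            "s3_bucket": "your-s3-bucket",
            "s3_path": "your/s3/path/",
        }
    ),
)

# Persist the connection in the Airflow metadata database.
session = settings.Session()
session.add(conn)
session.commit()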

Supported Engines (set in extras['engine'])

  • duckdb: Requires 'metadata_file' in extras or host as file path.
  • sqlite: Requires 'metadata_file' in extras or host as file path.
  • postgres: Requires host, login, password, and 'pgdbname' in extras.
  • mysql: Requires host, login, password, and 'mysqldbname' in extras.
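For contrast with the Postgres/S3 example above, a minimal file-backed configuration (paths are hypothetical; the keys come from the engine list above and the storage list below) might look like:

{
  "engine": "duckdb",
  "metadata_file": "/opt/airflow/ducklake/metadata.ducklake",
  "storage_type": "local",
  "local_data_path": "/opt/airflow/ducklake/data/"
}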

Supported Storage Types (set in extras['storage_type'], default 's3')

  • s3: Requires 's3_bucket', 's3_path'; optional AWS credentials.
  • azure: Requires 'azure_account_name', 'azure_container', 'azure_path'; optional 'connection_string'.
  • gcs: Requires 'gcs_bucket', 'gcs_path'; optional 'service_account_key' (JSON string).
  • local: Requires 'local_data_path'.
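As another sketch, a GCS-backed variant of the earlier Postgres example might carry keys like the following (all values hypothetical; note that 'service_account_key' is a JSON string, hence the escaping):

{
  "engine": "postgres",
  "pgdbname": "dev_nophiml_db",
  "storage_type": "gcs",
  "gcs_bucket": "your-gcs-bucket",
  "gcs_path": "your/gcs/path/",
  "service_account_key": "{\"type\": \"service_account\", \"project_id\": \"your-project\"}"
}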

The connection form shows only the core fields; engine- and storage-specific settings belong in Extra, keyed off the engine and storage_type you select there. To customize the static DuckLake connection commands (for example, to install additional extensions), provide a connect_stack list in extras. Commands that depend on runtime values (secrets, thread settings, attachments, etc.) are always appended automatically by the hook.
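To give a feel for task-level usage, here is a minimal sketch. The import path, the conn-id parameter name, and get_conn() are assumptions modeled on the community DuckDB provider this package is based on; consult the package source for the actual API:

import pendulum
from airflow.decorators import dag, task


@dag(schedule=None, start_date=pendulum.datetime(2024, 1, 1), catchup=False)
def ducklake_demo():
    @task
    def row_count() -> int:
        # Assumed module path and hook API, modeled on the DuckDB provider.
        from ducklake_provider.hooks.ducklake import DuckLakeHook  # hypothetical path

        hook = DuckLakeHook(ducklake_conn_id="ducklake_default")  # parameter name assumed
        con = hook.get_conn()  # assumed to return a DuckDB connection object
        return con.execute("SELECT count(*) FROM my_table").fetchone()[0]

    row_count()


ducklake_demo()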

Download files

Download the file for your platform.

Source Distribution

airflow_provider_ducklake-0.0.7.tar.gz (9.4 kB)

Uploaded Source

Built Distribution


airflow_provider_ducklake-0.0.7-py3-none-any.whl (10.4 kB)

Uploaded Python 3

File details

Details for the file airflow_provider_ducklake-0.0.7.tar.gz.


File hashes

Hashes for airflow_provider_ducklake-0.0.7.tar.gz
Algorithm    Hash digest
SHA256       c191878a44c99a097d4a719fca12b0cdbb512bb7fe3ead2c09f6d5403e129dc9
MD5          767bdbed00e4921d184fad050d4cd663
BLAKE2b-256  c089b42593932a196fdaea07fbec661890366a32dafc90a234fdfb76ba473961


File details

Details for the file airflow_provider_ducklake-0.0.7-py3-none-any.whl.


File hashes

Hashes for airflow_provider_ducklake-0.0.7-py3-none-any.whl
Algorithm    Hash digest
SHA256       d512dd2d9013acd612b82a5f0da554409c492d27b659c7286d5bfd683096e134
MD5          13a288eae6ffe28169eca99b4d13880b
BLAKE2b-256  183e8b138f3ba1e56aad385ff93549962f713f1e3bc4a782be4e8d5892d138fa

