# DuckLake Provider for Apache Airflow
This is a custom provider for integrating DuckLake (based on DuckDB) with Apache Airflow.
## DuckLake Configuration
The `DuckLakeHook` uses the standard Airflow connection fields plus the Extra field to configure the connection. The standard fields are repurposed as follows:

- Host: the metadata host (e.g., the Postgres/MySQL host) or a file path (for a DuckDB/SQLite metadata file).
- Login: username (for Postgres/MySQL).
- Password: password (for Postgres/MySQL).
- Schema: metadata schema (defaults to `duckdb`).
- Extra: a JSON dict for all other settings; `engine`, `storage_type`, and their conditional fields must go here.
Example extras JSON (adjust the keys to match your `engine` and `storage_type`):

```json
{
  "engine": "postgres",
  "dbname": "my_ducklake",
  "pgdbname": "dev_nophiml_db",
  "storage_type": "s3",
  "s3_bucket": "your-s3-bucket",
  "s3_path": "your/s3/path/",
  "aws_access_key_id": "your-access-key-id",
  "aws_secret_access_key": "your-secret-access-key",
  "aws_region": "us-east-1",
  "install_extensions": ["spatial"],
  "load_extensions": ["spatial"],
  "connect_stack": [
    "INSTALL httpfs;",
    "LOAD httpfs;",
    "INSTALL ducklake;",
    "LOAD ducklake;"
  ]
}
```

`install_extensions` and `load_extensions` are optional and inherited from the DuckDB provider. `connect_stack` is also optional; it overrides the default DuckLake install/load commands (see below).
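If you prefer to define the connection in code rather than through the UI, the standard `airflow.models.Connection` API can build and serialize it. A minimal sketch, assuming `ducklake` is the registered connection type (an assumption; check the provider metadata for the exact `conn_type`):

```python
import json

from airflow.models.connection import Connection

# Hypothetical conn_id and conn_type ("ducklake" is assumed, not confirmed
# by the package docs); host, login, and password are placeholders.
conn = Connection(
    conn_id="ducklake_default",
    conn_type="ducklake",
    host="postgres.example.internal",  # metadata host
    login="airflow",
    password="change-me",
    schema="duckdb",  # metadata schema; "duckdb" is the documented default
    extra=json.dumps(
        {
            "engine": "postgres",
            "pgdbname": "dev_nophiml_db",
            "storage_type": "s3",
            "s3_bucket": "your-s3-bucket",
            "s3_path": "your/s3/path/",
            "aws_region": "us-east-1",
        }
    ),
)

# The serialized URI can be exported, e.g. as the
# AIRFLOW_CONN_DUCKLAKE_DEFAULT environment variable.
print(conn.get_uri())
```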
## Supported Engines (set in `extras['engine']`)

- `duckdb`: requires `metadata_file` in extras, or the Host field set to the metadata file path.
- `sqlite`: requires `metadata_file` in extras, or the Host field set to the metadata file path.
- `postgres`: requires Host, Login, Password, and `pgdbname` in extras.
- `mysql`: requires Host, Login, Password, and `mysqldbname` in extras.
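For example, a DuckDB metadata catalog paired with local storage needs only a couple of extras keys. A minimal sketch (the paths are placeholders):

```json
{
  "engine": "duckdb",
  "metadata_file": "/opt/airflow/ducklake/metadata.ducklake",
  "storage_type": "local",
  "local_data_path": "/opt/airflow/ducklake/data/"
}
```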
## Supported Storage Types (set in `extras['storage_type']`, default `s3`)

- `s3`: requires `s3_bucket` and `s3_path`; AWS credentials are optional.
- `azure`: requires `azure_account_name`, `azure_container`, and `azure_path`; `connection_string` is optional.
- `gcs`: requires `gcs_bucket` and `gcs_path`; `service_account_key` (a JSON string) is optional.
- `local`: requires `local_data_path`.
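Likewise, a GCS-backed configuration combines the engine keys above with the `gcs` storage keys. A sketch with placeholder values:

```json
{
  "engine": "postgres",
  "pgdbname": "dev_nophiml_db",
  "storage_type": "gcs",
  "gcs_bucket": "your-gcs-bucket",
  "gcs_path": "your/gcs/path/",
  "service_account_key": "<service-account-key JSON string>"
}
```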
The Airflow connection form shows only the core fields; put engine- and storage-specific settings in extras. To change behavior, select the engine and storage type in extras and provide the corresponding keys.
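Once the connection exists, the hook can be used inside a task. A minimal sketch, assuming the import path `airflow_provider_ducklake.hooks.ducklake`, a `ducklake_conn_id` constructor argument, and a DB-API-style `get_conn()` method (all hypothetical; check the installed package for the exact names):

```python
import pendulum

from airflow.decorators import dag, task


@dag(schedule=None, start_date=pendulum.datetime(2024, 1, 1), catchup=False)
def ducklake_example():
    @task
    def query_ducklake() -> list:
        # Hypothetical import path and constructor argument; verify both
        # against the installed package before relying on this.
        from airflow_provider_ducklake.hooks.ducklake import DuckLakeHook

        hook = DuckLakeHook(ducklake_conn_id="ducklake_default")
        # Assumes the hook exposes the underlying DuckDB connection, which
        # supports execute()/fetchall() like a DB-API cursor.
        conn = hook.get_conn()
        return conn.execute("SELECT 42 AS answer").fetchall()

    query_ducklake()


ducklake_example()
```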
If you need to customize the static DuckLake connection commands (for example, to install additional extensions), provide a `connect_stack` list in extras, as shown below. Commands that depend on runtime values (secrets, thread settings, attachments, and so on) are always appended automatically by the hook.
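For example, to also install and load the `spatial` extension as part of the static commands, the extras could extend the default stack shown earlier (a sketch, not a confirmed configuration):

```json
{
  "connect_stack": [
    "INSTALL httpfs;",
    "LOAD httpfs;",
    "INSTALL spatial;",
    "LOAD spatial;",
    "INSTALL ducklake;",
    "LOAD ducklake;"
  ]
}
```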
## Download files
### airflow_provider_ducklake-0.0.8.tar.gz (source distribution)

- Size: 9.4 kB
- Uploaded via: twine/6.1.0 CPython/3.13.5
- Uploaded using Trusted Publishing? No

| Algorithm | Hash digest |
|---|---|
| SHA256 | 7ff15b1e2298cf87388b2ff7406d3ac66608835fee5b50930e915ce15f3a26eb |
| MD5 | 3ffa6faeb117c9dcb3b97ef09d25e275 |
| BLAKE2b-256 | f6c811feacd98259f5f67c5131ab89fad34fbe17e75ee516df118e18889ae5ec |
### airflow_provider_ducklake-0.0.8-py3-none-any.whl (built distribution, Python 3)

- Size: 10.4 kB
- Uploaded via: twine/6.1.0 CPython/3.13.5
- Uploaded using Trusted Publishing? No

| Algorithm | Hash digest |
|---|---|
| SHA256 | e62beec185b89bdfac1a278e11d1873d9690fa904a4428cf6c5f4bd3d6d93feb |
| MD5 | 9e986c619dcb512a7b717b74c1ae75df |
| BLAKE2b-256 | 60ace0da69e0c3966a90b2ce2c6fb101aecb0876299bf3cf7197b63508f591ba |