# dagster-odp (open data platform)

A configuration-driven framework for building Dagster pipelines.
dagster-odp simplifies data pipeline development by enabling teams to build Dagster pipelines through configuration rather than code. It reduces the learning curve for Dagster while promoting standardization and faster development of data workflows.
## Key Features

- **Configuration-Driven Development**: Build data pipelines using YAML/JSON instead of Python code
- **Pre-built Tasks**:
  - Google Cloud operations: transfer and export data between GCS and BigQuery, with support for GCS file downloads
  - DuckDB operations: load files into DuckDB, execute SQL queries, and export table contents to files
  - Utility operations: execute shell commands with configurable environments and working directories
- **Extensible Framework**: Create custom tasks, sensors, and resources that can be used directly in configuration files
- **Enhanced Modern Data Stack Integration**:
  - DLT+: extended integration with automatic asset creation and granular object handling
  - DBT+: simplified variable management and external source configuration
  - Soda: configuration-driven data quality checks
- **Enhanced Asset Management**:
  - Standardized materialization metadata
  - Simplified dependency management
  - External source handling
- **Flexible Automation**: Configuration-based jobs, schedules, sensors, and partitioning
## Quick Example

Here's a simple pipeline that downloads data and loads it into DuckDB:

```yaml
# odp_config/workflows/pipeline.yaml
assets:
  - asset_key: raw_data
    task_type: url_file_download
    params:
      source_url: https://example.com/data.parquet
      destination_file_path: ./data/raw.parquet

  - asset_key: analyzed_data
    task_type: file_to_duckdb
    depends_on: [raw_data]
    params:
      source_file_uri: "{{raw_data.destination_file_path}}"
      destination_table_id: analyzed_table
```
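The `{{raw_data.destination_file_path}}` reference hands an upstream asset's parameter to the downstream task. As a conceptual illustration only (not dagster-odp's actual implementation), cross-asset references like this can be resolved with a simple template substitution over the params dictionary:

```python
import re

def resolve_references(params, completed_assets):
    """Resolve "{{asset_key.param}}" placeholders against upstream asset params.

    Illustrative sketch: `completed_assets` maps asset keys to the params
    recorded when each upstream asset materialized.
    """
    pattern = re.compile(r"\{\{\s*(\w+)\.(\w+)\s*\}\}")

    def substitute(value):
        if not isinstance(value, str):
            return value
        # Replace each placeholder with the referenced upstream param value.
        return pattern.sub(
            lambda m: str(completed_assets[m.group(1)][m.group(2)]), value
        )

    return {key: substitute(value) for key, value in params.items()}

upstream = {"raw_data": {"destination_file_path": "./data/raw.parquet"}}
params = {
    "source_file_uri": "{{raw_data.destination_file_path}}",
    "destination_table_id": "analyzed_table",
}
resolved = resolve_references(params, upstream)
# resolved["source_file_uri"] == "./data/raw.parquet"
```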
## Installation

```shell
pip install dagster-odp
```
## Getting Started

1. Create a new project using the Dagster CLI:

   ```shell
   dagster project scaffold --name my-odp-project
   cd my-odp-project
   ```

2. Create the ODP configuration directories:

   ```shell
   mkdir -p odp_config/workflows
   ```

3. Update your `definitions.py`:

   ```python
   from dagster_odp import build_definitions

   defs = build_definitions("odp_config")
   ```

4. Start building pipelines in your workflows directory using YAML/JSON configuration.
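Other pre-built tasks follow the same workflow shape as the quick example. For instance, a shell-command asset might look like the sketch below; the `task_type` name and parameter names here are assumptions for illustration (the feature list only says shell commands support configurable environments and working directories), so consult the task reference for the exact schema:

```yaml
# odp_config/workflows/shell_example.yaml
# NOTE: hypothetical schema -- task_type and param names are illustrative.
assets:
  - asset_key: fetched_report
    task_type: shell_command
    params:
      command: "python scripts/fetch_report.py"
      working_dir: ./scripts
      env:
        REPORT_DATE: "2024-01-01"
```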
Check out our Quickstart Guide for a complete walkthrough.
## Who Should Use dagster-odp?
- Data Teams seeking to standardize pipeline creation
- Data Analysts/Scientists who want to create pipelines without extensive coding
- Data Engineers looking to reduce boilerplate code and maintenance overhead
- Organizations adopting Dagster who want to accelerate development
## Documentation

Comprehensive documentation is available.
## Contributing
Contributions are welcome! Please read our Contributing Guidelines for details on how to submit pull requests, report issues, and contribute to the project.
## License
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
## File details

Details for the file `dagster_odp-0.1.0.tar.gz`.

### File metadata

- Download URL: dagster_odp-0.1.0.tar.gz
- Upload date:
- Size: 43.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.9.20

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | b4c40f0087734e02241ead6e950a87331460e8524fc2cfb55505d1eea678a444 |
| MD5 | 12bf906c9a54f9f50491ea4660b04dd2 |
| BLAKE2b-256 | 2f6816942c4fa9bc3a64d7947e887575aab883ac20040ba1835449dfcb7b909a |
## File details

Details for the file `dagster_odp-0.1.0-py3-none-any.whl`.

### File metadata

- Download URL: dagster_odp-0.1.0-py3-none-any.whl
- Upload date:
- Size: 53.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.9.20

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | d6ddedeb694f87ebffe88665376d3f95f8092f076e72698e901648c6e1525d29 |
| MD5 | a306f5d07ed15660f49ce36422666fd2 |
| BLAKE2b-256 | b36699b71c25c5bcae306ea8704c62d4e98eb333e77c58de3610fa0d0e43ee71 |
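To verify a downloaded artifact against the digests listed above, you can hash it locally with the Python standard library; a minimal sketch:

```python
import hashlib

def sha256_of_file(path, chunk_size=65536):
    """Compute the SHA-256 hex digest of a file, reading in chunks
    so large archives don't need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Example usage: compare against the SHA256 published in the table above.
# expected = "b4c40f0087734e02241ead6e950a87331460e8524fc2cfb55505d1eea678a444"
# assert sha256_of_file("dagster_odp-0.1.0.tar.gz") == expected
```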