Fast.BI DBT Runner
A comprehensive Python library for managing DBT (Data Build Tool) DAGs within the Fast.BI data development platform. This package provides multiple execution operators optimized for different cost-performance trade-offs, from low-cost slow execution to high-cost fast execution.
🚀 Overview
Fast.BI DBT Runner is part of the Fast.BI Data Development Platform, designed to provide flexible and scalable DBT workload execution across various infrastructure options. The package offers four distinct operator types, each optimized for specific use cases and requirements.
🎯 Key Features
- Multiple Execution Operators: Choose from K8S, Bash, API, or GKE operators
- Cost-Performance Optimization: Scale from low-cost to high-performance execution
- Airflow Integration: Seamless integration with Apache Airflow workflows
- Manifest Parsing: Intelligent DBT manifest parsing for dynamic DAG generation (see the sketch after this list)
- Airbyte Integration: Built-in support for Airbyte task group building
- Flexible Configuration: Extensive configuration options for various deployment scenarios
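To give a feel for what the manifest-parsing feature works with, here is a minimal sketch of reading a dbt manifest and extracting model-to-model dependencies, which is the information a dynamic DAG builder needs. This is illustrative only and does not reflect fast-bi-dbt-runner internals; the file path and field names follow dbt's standard target/manifest.json layout.

```python
import json

# Illustrative sketch: derive model-level dependencies from a dbt manifest.
with open("target/manifest.json") as f:
    manifest = json.load(f)

model_deps = {}
for node_id, node in manifest["nodes"].items():
    if node["resource_type"] == "model":
        # Keep only upstream dependencies that are themselves models
        model_deps[node_id] = [
            dep for dep in node["depends_on"]["nodes"] if dep.startswith("model.")
        ]

for model, deps in model_deps.items():
    print(model, "<-", deps)
```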
📦 Installation
Basic Installation (Core Package)
```bash
pip install fast-bi-dbt-runner
```
With Airflow Integration
```bash
pip install fast-bi-dbt-runner[airflow]
```
With Development Tools
```bash
pip install fast-bi-dbt-runner[dev]
```
With Documentation Tools
```bash
pip install fast-bi-dbt-runner[docs]
```
Complete Installation
```bash
pip install fast-bi-dbt-runner[airflow,dev,docs]
```
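To confirm the installation, the package can be imported directly; the module name matches the import used in the Quick Start below.

```python
# Sanity check: the core package should import even without the Airflow extra,
# since Airflow support is installed separately.
import fast_bi_dbt_runner
print("fast_bi_dbt_runner imported successfully")
```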
🏗️ Architecture
Operator Types
The package provides four different operators for running DBT transformation pipelines:
1. K8S (Kubernetes) Operator - Default Choice
- Best for: Cost optimization, daily/nightly jobs, high concurrency
- Characteristics: Creates dedicated Kubernetes pods per task
- Trade-offs: Most cost-effective but slower execution speed
- Use cases: Daily ETL pipelines, projects with less frequent runs
2. Bash Operator
- Best for: Balanced cost-speed ratio, medium-sized projects
- Characteristics: Runs within Airflow worker resources
- Trade-offs: Faster than K8S but limited by worker capacity
- Use cases: Medium-sized projects, workflows requiring faster execution
3. API Operator
- Best for: High performance, time-sensitive workflows
- Characteristics: Dedicated machine per project, always-on resources
- Trade-offs: Fastest execution but highest cost
- Use cases: Large-scale projects, real-time analytics, high-frequency execution
4. GKE Operator
- Best for: Complete isolation, external client workloads
- Characteristics: Creates dedicated GKE clusters
- Trade-offs: Full isolation but higher operational complexity
- Use cases: External client workloads, isolated environment requirements
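The operator type is selected through the OPERATOR configuration value shown in the examples below. As a rough sketch of how a DAG factory might dispatch on that value (DbtManifestParserK8sOperator comes from the Quick Start example; the other class names are hypothetical placeholders, not the package's documented API):

```python
# Illustrative only: map the OPERATOR config value to an operator class.
from fast_bi_dbt_runner import DbtManifestParserK8sOperator

OPERATOR_CLASSES = {
    "k8s": DbtManifestParserK8sOperator,  # low cost, dedicated pod per task
    # "bash": ...,  # hypothetical: runs within Airflow worker resources
    # "api": ...,   # hypothetical: dedicated machine per project
    # "gke": ...,   # hypothetical: dedicated GKE cluster
}

def build_operator(config: dict, **kwargs):
    """Pick the operator class based on the OPERATOR config value (default: k8s)."""
    operator_cls = OPERATOR_CLASSES[config.get("OPERATOR", "k8s")]
    return operator_cls(**kwargs)
```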
🚀 Quick Start
Basic Usage
```python
from fast_bi_dbt_runner import DbtManifestParserK8sOperator

# Create a K8S operator instance
operator = DbtManifestParserK8sOperator(
    task_id='run_dbt_models',
    project_id='my-gcp-project',
    dbt_project_name='my_analytics',
    operator='k8s'
)

# Execute DBT models (context is the Airflow task context)
operator.execute(context)
```
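In practice the operator typically runs inside an Airflow DAG rather than being executed directly. A minimal sketch, assuming the class behaves like a standard Airflow operator (parameter names taken from the example above; DAG settings are illustrative):

```python
from datetime import datetime
from airflow import DAG
from fast_bi_dbt_runner import DbtManifestParserK8sOperator

# Minimal Airflow DAG wiring; Airflow supplies the task context at runtime.
with DAG(
    dag_id="my_analytics_dbt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    run_dbt_models = DbtManifestParserK8sOperator(
        task_id="run_dbt_models",
        project_id="my-gcp-project",
        dbt_project_name="my_analytics",
        operator="k8s",
    )
```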
Configuration Example
```python
# K8S Operator Configuration
k8s_config = {
    'PLATFORM': 'Airflow',
    'OPERATOR': 'k8s',
    'PROJECT_ID': 'my-gcp-project',
    'DBT_PROJECT_NAME': 'my_analytics',
    'DAG_SCHEDULE_INTERVAL': '@daily',
    'DATA_QUALITY': 'True',
    'DBT_SOURCE': 'True'
}

# API Operator Configuration
api_config = {
    'PLATFORM': 'Airflow',
    'OPERATOR': 'api',
    'PROJECT_ID': 'my-gcp-project',
    'DBT_PROJECT_NAME': 'realtime_analytics',
    'DAG_SCHEDULE_INTERVAL': '*/15 * * * *',
    'MODEL_DEBUG_LOG': 'True'
}
```
📚 Documentation
For detailed documentation, visit our Fast.BI Platform Documentation.
Key Documentation Sections
🔧 Configuration
Core Variables
| Variable | Description | Default Value |
|---|---|---|
| PLATFORM | Data orchestration platform | Airflow |
| OPERATOR | Execution operator type | k8s |
| PROJECT_ID | Google Cloud project identifier | Required |
| DBT_PROJECT_NAME | DBT project identifier | Required |
| DAG_SCHEDULE_INTERVAL | Pipeline execution schedule | @once |
Feature Flags
| Variable | Description | Default |
|---|---|---|
| DBT_SEED | Enable seed data loading | False |
| DBT_SOURCE | Enable source loading | False |
| DBT_SNAPSHOT | Enable snapshot creation | False |
| DATA_QUALITY | Enable quality service | False |
| DEBUG | Enable connection verification | False |
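Flags are passed as strings, as in the configuration examples above. A minimal sketch of a configuration that enables seeds, snapshots, and connection verification together (the specific values are illustrative, not required settings):

```python
# Illustrative configuration combining several feature flags;
# flag values are strings, mirroring the examples in this README.
config = {
    'PLATFORM': 'Airflow',
    'OPERATOR': 'k8s',
    'PROJECT_ID': 'my-gcp-project',
    'DBT_PROJECT_NAME': 'my_analytics',
    'DAG_SCHEDULE_INTERVAL': '@once',
    'DBT_SEED': 'True',
    'DBT_SNAPSHOT': 'True',
    'DEBUG': 'True'
}
```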
🎯 Use Cases
Daily ETL Pipeline
```python
# Low-cost, reliable daily processing
config = {
    'OPERATOR': 'k8s',
    'DAG_SCHEDULE_INTERVAL': '@daily',
    'DBT_SOURCE': 'True',
    'DATA_QUALITY': 'True'
}
```
Real-time Analytics
```python
# High-performance, frequent execution
config = {
    'OPERATOR': 'api',
    'DAG_SCHEDULE_INTERVAL': '*/15 * * * *',
    'MODEL_DEBUG_LOG': 'True'
}
```
External Client Workload
```python
# Isolated, dedicated resources
config = {
    'OPERATOR': 'gke',
    'CLUSTER_NAME': 'client-isolated-cluster',
    'DATA_QUALITY': 'True'
}
```
🔍 Monitoring and Debugging
Enable Debug Logging
```python
config = {
    'DEBUG': 'True',
    'MODEL_DEBUG_LOG': 'True'
}
```
Data Quality Integration
```python
config = {
    'DATA_QUALITY': 'True',
    'DATAHUB_ENABLED': 'True'
}
```
🚀 CI/CD and Automation
This package uses GitHub Actions for continuous integration and deployment:
- Automated Testing: Tests across Python 3.9-3.12
- Code Quality: Linting, formatting, and type checking
- Automated Publishing: Automatic PyPI releases on version tags
- Documentation: Automated documentation building and deployment
Release Process
1. Create a version tag: `git tag v1.0.0`
2. Push the tag: `git push origin v1.0.0`
3. GitHub Actions automatically:
   - Tests the package
   - Builds and validates
   - Publishes to PyPI
   - Creates a GitHub release
🤝 Contributing
We welcome contributions! Please see our Contributing Guidelines for details.
Development Setup
```bash
# Clone the repository
git clone https://github.com/fast-bi/dbt-workflow-core-runner.git
cd dbt-workflow-core-runner

# Install in development mode with all tools
pip install -e .[dev,airflow]

# Run tests
pytest

# Check code quality
flake8 fast_bi_dbt_runner/
black --check fast_bi_dbt_runner/
mypy fast_bi_dbt_runner/
```
📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
🆘 Support
- Documentation: Fast.BI Platform Wiki
- Email: support@fast.bi
- Issues: GitHub Issues
- Source: GitHub Repository
🔗 Related Projects
- Fast.BI Platform - Complete data development platform
- Fast.BI Replication Control - Data replication management
- Apache Airflow - Workflow orchestration platform
Fast.BI DBT Runner - Empowering data teams with flexible, scalable DBT execution across the Fast.BI platform.