
airflow-pydantic

Pydantic models for Apache Airflow


Overview

Pydantic models of Apache Airflow data structures.

Primary Use Case: This library is designed to enable declarative DAG definitions using airflow-config or other YAML/JSON-based configuration frameworks. By representing Airflow constructs as Pydantic models, DAGs can be defined in configuration files rather than Python code, enabling better separation of concerns, easier testing, and configuration-driven workflows.


Usage

Declarative DAGs with airflow-config (Recommended)

The primary use of airflow-pydantic is to build declarative, configuration-driven DAGs using airflow-config or similar YAML/JSON-based frameworks:

```yaml
# config/my_dag.yaml
default_args:
  _target_: airflow_pydantic.TaskArgs
  owner: data-team
  retries: 3

default_dag_args:
  _target_: airflow_pydantic.DagArgs
  schedule: "@daily"
  start_date: "2024-01-01"
  catchup: false
```

This approach allows you to:

  • Define DAGs in YAML/JSON instead of Python
  • Separate configuration from code
  • Easily manage environment-specific settings
  • Version control your DAG configurations
  • Generate and validate DAGs programmatically
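The `_target_` keys in the YAML above follow the dotted-import-path convention used by airflow-config (and Hydra-style loaders): each mapping names a class to import and passes the remaining keys to it as keyword arguments. As a rough, standard-library-only sketch of that resolution step (the `instantiate` helper below is illustrative, not part of airflow-pydantic's API):

```python
import importlib


def instantiate(cfg: dict):
    """Resolve a Hydra-style mapping: import the `_target_` dotted path
    and call it with the remaining keys as keyword arguments."""
    cfg = dict(cfg)  # avoid mutating the caller's mapping
    module_path, _, attr = cfg.pop("_target_").rpartition(".")
    target = getattr(importlib.import_module(module_path), attr)
    return target(**cfg)


# Stand-in target from the standard library, since resolving
# airflow_pydantic.TaskArgs requires the package to be installed:
td = instantiate({"_target_": "datetime.timedelta", "days": 3})
print(td)  # 3 days, 0:00:00
```

In a real deployment, airflow-config performs this resolution for you when it loads the configuration file.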

Programmatic Usage

All operators and sensors support two methods:

  • instantiate(): Create a concrete Airflow instance at runtime
  • render(): Generate Python code as a string for the Airflow construct
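To illustrate the split between the two methods with a toy sketch (a plain dataclass standing in for the library's Pydantic models, not its actual implementation): `instantiate()` would return a live operator object, while `render()` returns source text for the same construct.

```python
from dataclasses import dataclass


@dataclass
class BashTaskSketch:
    """Toy stand-in for a task model; the real airflow_pydantic classes
    are Pydantic models with validation."""
    task_id: str
    bash_command: str

    def render(self) -> str:
        # Emit Python source for the equivalent Airflow operator call.
        return (
            f"BashOperator(bash_command={self.bash_command!r}, "
            f"task_id={self.task_id!r}, dag=dag)"
        )


task = BashTaskSketch(task_id="hello", bash_command="echo 'Hello World'")
print(task.render())
```

The string-producing path is what makes configuration-to-code generation possible, as shown in the next section.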

Code Generation with render()

The render() method generates valid Python code from your Pydantic models, enabling code generation workflows:

```python
from airflow_pydantic import Dag, BashTask
from datetime import datetime

dag = Dag(
    dag_id="generated-dag",
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    tasks={
        "hello": BashTask(
            task_id="hello",
            bash_command="echo 'Hello World'",
        ),
    },
)

# Generate Python code
python_code = dag.render()

# Save to a DAG file
with open("dags/generated_dag.py", "w") as f:
    f.write(python_code)
```

Generated File:

```python
from datetime import datetime

from airflow.models import DAG
from airflow.providers.standard.operators.bash import BashOperator

with DAG(schedule="@daily", start_date=datetime.fromisoformat("2024-01-01T00:00:00"), dag_id="generated-dag") as dag:
    hello = BashOperator(bash_command="echo 'Hello World'", task_id="hello", dag=dag)
```

This is useful for:

  • Generating DAG files from configuration during CI/CD
  • Creating DAG templates programmatically
  • Migrating from configuration-driven to static DAG files
  • Debugging and inspecting generated DAG code
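For the CI/CD case, the rendered string can be syntax-checked before it is written into the dags folder. This generic check uses only the standard library and does not depend on airflow-pydantic:

```python
import ast


def is_valid_python(source: str) -> bool:
    """Return True if the rendered DAG source parses as Python."""
    try:
        ast.parse(source)
        return True
    except SyntaxError:
        return False


# In a pipeline, `source` would be the output of dag.render():
print(is_valid_python("dag_id = 'generated-dag'"))  # True
```

A failed parse here lets the pipeline reject a bad template before Airflow's scheduler ever sees the file.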

> [!NOTE]
> This library was generated using copier from the Base Python Project Template repository.
