Project description

Airflow Bootstrap Utils

Collection of Python tools to generate Airflow DAGs from control and configuration files.

Exported Console Scripts

  • airflow-generate-dag: Generate Airflow DAGs from control and configuration files.

  • airflow-validate-control-file: Validate control files.

Installation

```shell
pip install airflow-bootstrap-utils
```

Usage

```shell
generate-airflow-dag-script --control_file /tmp/airflow-demo/simple_workflow.yaml --outdir /tmp/airflow-demo
--config_file was not specified and therefore was set to '/tmp/airflow-bootstrap-utils/venv/lib/python3.10/site-packages/airflow_bootstrap_utils/conf/config.yaml'
--template_dir was not specified and therefore was set to '/tmp/airflow-bootstrap-utils/venv/lib/python3.10/site-packages/airflow_bootstrap_utils/templates'
--logfile was not specified and therefore was set to '/tmp/airflow-demo/generate_dag.log'
Wrote Airflow DAG Python script list to file '/tmp/sundaram/airflow-bootstrap-utils/user_processing_L2/2025-02-15-101314/airflow_dag_scripts.txt'
```
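The control file itself is not shown above. Purely as an illustration of the kind of information such a file must carry (a DAG name, a schedule, per-lab task commands, and dependencies), a sketch might look like the following. The field names here are guesses, not the package's actual schema; consult the shipped conf/config.yaml and templates directory for the real format.

```yaml
# Hypothetical control file sketch -- field names are illustrative,
# not the actual airflow-bootstrap-utils schema.
dag_name: user_processing
description: Example workflow
schedule_interval: "@weekly"
start_date: 2024-03-22
lab_numbers:
  - L1
  - L2
tasks:
  - task_id: do_something
    bash_command: bash /opt/do_something.sh --lab_number {lab_number}
  - task_id: do_something_else
    bash_command: bash /opt/do_something_else.sh --lab_number {lab_number}
dependencies:
  - do_something >> do_something_else
```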

Contents of airflow_dag_scripts.txt:

```shell
cat /tmp/sundaram/airflow-bootstrap-utils/user_processing_L2/2025-02-15-101314/airflow_dag_scripts.txt
## method-created: /tmp/airflow-bootstrap-utils/venv/lib/python3.10/site-packages/airflow_bootstrap_utils/manager.py
## date-created: 2025-02-15-101314
## created-by: sundaram
## control-file: /tmp/airflow-demo/simple_workflow.yaml
## logfile: /tmp/airflow-demo/generate_dag.log
/tmp/sundaram/airflow-bootstrap-utils/user_processing_L1/2025-02-15-101314/L1/user_processing_L1.airflow.dag.py
/tmp/sundaram/airflow-bootstrap-utils/user_processing_L2/2025-02-15-101314/L2/user_processing_L2.airflow.dag.py
```
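Header lines in airflow_dag_scripts.txt start with `##`, and every remaining non-empty line is a generated script path. A downstream script can recover those paths by skipping the header; a stdlib-only sketch (not part of the package, and the paths below are stand-ins):

```python
def read_dag_script_paths(text: str) -> list[str]:
    """Return DAG script paths from a manifest, skipping '##' header lines."""
    paths = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("##"):
            continue
        paths.append(line)
    return paths

# Illustrative manifest content in the format shown above.
manifest = """\
## date-created: 2025-02-15-101314
## created-by: sundaram
/tmp/a/user_processing_L1.airflow.dag.py
/tmp/b/user_processing_L2.airflow.dag.py
"""
print(read_dag_script_paths(manifest))
```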

Contents of L1 airflow script:

```python
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from datetime import datetime

#------------------------------
# Airflow DAG definition
#------------------------------
dag = DAG(
    'user_processing_L1',
    description='This is an example DAG in the airflow-bootstrap-utils. Please provide a better description in your copy of the control file.',
    schedule_interval='@weekly',
    start_date=datetime(2024, 3, 22),
    catchup=False,
)

#------------------------------
# Airflow task definitions
#------------------------------

"""
The do_something executable is going to do something.
It is going to use the lab number argument to do
something.
The lab number is a required parameter.
"""
do_something = BashOperator(
    task_id='do_something',
    bash_command='bash /opt/do_something.sh --lab_number L1',
    dag=dag
)

"""
The do_something_else executable is going to do something else.
It is going to use the lab number argument to do
something else.
The lab number is a required parameter.
"""
do_something_else = BashOperator(
    task_id='do_something_else',
    bash_command='bash /opt/do_something_else.sh --lab_number L1',
    dag=dag
)

#------------------------------
# Define the task dependencies
#------------------------------
do_something >> do_something_else
```
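The package renders scripts like the one above from templates (note the `--template_dir` default in the usage output). As a rough, stdlib-only sketch of the idea — not the package's actual implementation, and the `control` dict below merely stands in for a parsed control file — one can render a per-lab DAG script like so:

```python
from string import Template

# Simplified stand-in for a DAG-script template (illustrative only).
DAG_TEMPLATE = Template("""\
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from datetime import datetime

dag = DAG(
    '${dag_id}',
    description='${description}',
    schedule_interval='${schedule}',
    start_date=datetime(${year}, ${month}, ${day}),
    catchup=False,
)
""")

def render_dag_script(control: dict, lab_number: str) -> str:
    """Render a DAG script for one lab number from a parsed control file."""
    year, month, day = control["start_date"]
    return DAG_TEMPLATE.substitute(
        dag_id=f"{control['dag_name']}_{lab_number}",
        description=control["description"],
        schedule=control["schedule"],
        year=year, month=month, day=day,
    )

control = {
    "dag_name": "user_processing",
    "description": "Example workflow",
    "schedule": "@weekly",
    "start_date": (2024, 3, 22),
}
script = render_dag_script(control, "L1")
print("user_processing_L1" in script)  # True
```

Rendering once per lab number is what yields the `user_processing_L1` and `user_processing_L2` scripts shown here.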

Contents of L2 airflow script:

```python
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from datetime import datetime

#------------------------------
# Airflow DAG definition
#------------------------------
dag = DAG(
    'user_processing_L2',
    description='This is an example DAG in the airflow-bootstrap-utils. Please provide a better description in your copy of the control file.',
    schedule_interval='@weekly',
    start_date=datetime(2024, 3, 22),
    catchup=False,
)

#------------------------------
# Airflow task definitions
#------------------------------

"""
The do_something executable is going to do something.
It is going to use the lab number argument to do
something.
The lab number is a required parameter.
"""
do_something = BashOperator(
    task_id='do_something',
    bash_command='bash /opt/do_something.sh --lab_number L2',
    dag=dag
)

"""
The do_something_else executable is going to do something else.
It is going to use the lab number argument to do
something else.
The lab number is a required parameter.
"""
do_something_else = BashOperator(
    task_id='do_something_else',
    bash_command='bash /opt/do_something_else.sh --lab_number L2',
    dag=dag
)

#------------------------------
# Define the task dependencies
#------------------------------
do_something >> do_something_else
```
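Before copying generated scripts into an Airflow dags folder, it can be worth checking that each one at least compiles as Python. This is only a lightweight sketch using the stdlib — a full check would import the scripts through Airflow's DagBag — and the files created below are stand-ins for real generated scripts:

```python
import py_compile
import tempfile

def compiles_ok(path: str) -> bool:
    """Return True if the file at `path` parses as valid Python."""
    try:
        py_compile.compile(path, doraise=True)
        return True
    except py_compile.PyCompileError:
        return False

# Illustrative stand-ins for generated DAG scripts (hypothetical content).
good = tempfile.NamedTemporaryFile("w", suffix=".py", delete=False)
good.write("dag_id = 'user_processing_L1'\n")
good.close()

bad = tempfile.NamedTemporaryFile("w", suffix=".py", delete=False)
bad.write("def broken(:\n")
bad.close()

print(compiles_ok(good.name), compiles_ok(bad.name))  # True False
```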

History

0.1.0 (2024-03-20)

  • First release on PyPI.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

airflow_bootstrap_utils-0.2.1.tar.gz (18.8 kB)

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

airflow_bootstrap_utils-0.2.1-py2.py3-none-any.whl (15.3 kB)

File details

Details for the file airflow_bootstrap_utils-0.2.1.tar.gz.

File metadata

  • Download URL: airflow_bootstrap_utils-0.2.1.tar.gz
  • Upload date:
  • Size: 18.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.10.12

File hashes

Hashes for airflow_bootstrap_utils-0.2.1.tar.gz:

  • SHA256: ee522e658c1882629397aadf974224970ff8617debf6efa3627fcff17a3e8254

  • MD5: 0eed5b8174647ee50bd9c8f7e0422254

  • BLAKE2b-256: 7a885c03778c5de215e6f1e8209dd9fcf07e2c5bfea1781155c83bebe570a16f

See more details on using hashes here.

File details

Details for the file airflow_bootstrap_utils-0.2.1-py2.py3-none-any.whl.

File hashes

Hashes for airflow_bootstrap_utils-0.2.1-py2.py3-none-any.whl:

  • SHA256: f5507e584ff5b97288eacacb3a3fd270ae7b6a40308fdf377b53aecb6244414d

  • MD5: 3a18556a975add24addcf4ea4107efe8

  • BLAKE2b-256: aad4035cabd2a2bf5d145bdec91f6c3985e9a08f0930ff9dc5d52da646e5c841

See more details on using hashes here.
