Apache Airflow API (Stable)

Apache Airflow Python Client

Requirements

Python >= 3.7

Installation & Usage

pip install

You can install directly using pip:

pip install apache-airflow-client
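
If you need a specific release, for example to match the Airflow version you are running against, you can pin it explicitly. The version below is only an example:

pip install apache-airflow-client==2.7.0rc1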

Setuptools

Or install via Setuptools.

git clone git@github.com:apache/airflow-client-python.git
cd airflow-client-python
python setup.py install --user

(or sudo python setup.py install to install the package for all users)

Then import the package:

import airflow_client.client
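
As a quick smoke test you can print the installed client version. This is a minimal sketch, assuming the generated package exposes a `__version__` attribute, as OpenAPI-generated Python clients typically do:

import airflow_client.client

# Assumed attribute: OpenAPI-generated packages usually define __version__ at the package root
print(airflow_client.client.__version__)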

Changelog

See CHANGELOG.md to keep track of what has changed in the client.

Getting Started

Please follow the installation procedure above and then run the following example Python script:

import uuid

import airflow_client.client
try:
    # If you have rich installed, you will have nice colored output of the API responses
    from rich import print
except ImportError:
    print("Output will not be colored. Please install rich to get colored output: `pip install rich`")
    pass
from airflow_client.client.api import config_api, dag_api, dag_run_api
from airflow_client.client.model.dag_run import DAGRun

# The client must use the authentication and authorization parameters
# in accordance with the API server security policy.
# Examples for each auth method are provided below, use the example that
# satisfies your auth use case.
#
# In the case of the basic authentication below, make sure that Airflow is also
# configured with basic_auth as a backend, in addition to the regular session backend
# needed by the UI. In the `[api]` section of your `airflow.cfg` set:
#
# auth_backend = airflow.api.auth.backend.session,airflow.api.auth.backend.basic_auth
#
# Make sure that your user/password are configured properly - use a user/password that has admin
# privileges in Airflow

# Configure HTTP basic authorization: Basic
configuration = airflow_client.client.Configuration(
    host="http://localhost:8080/api/v1",
    username='admin',
    password='admin'
)

# Make sure that in the [core] section of your airflow.cfg the `load_examples` config is set to True,
# or that the AIRFLOW__CORE__LOAD_EXAMPLES environment variable is set to True
DAG_ID = "example_bash_operator"

# Enter a context with an instance of the API client
with airflow_client.client.ApiClient(configuration) as api_client:

    errors = False

    print('[blue]Getting DAG list')
    dag_api_instance = dag_api.DAGApi(api_client)
    try:
        api_response = dag_api_instance.get_dags()
        print(api_response)
    except airflow_client.client.OpenApiException as e:
        print("[red]Exception when calling DagAPI->get_dags: %s\n" % e)
        errors = True
    else:
        print('[green]Getting DAG list successful')


    print('[blue]Getting Tasks for a DAG')
    try:
        api_response = dag_api_instance.get_tasks(DAG_ID)
        print(api_response)
    except airflow_client.client.exceptions.OpenApiException as e:
        print("[red]Exception when calling DagAPI->get_tasks: %s\n" % e)
        errors = True
    else:
        print('[green]Getting Tasks successful')


    print('[blue]Triggering a DAG run')
    dag_run_api_instance = dag_run_api.DAGRunApi(api_client)
    try:
        # Create a DAGRun object (no dag_id should be specified because it is a read-only property of DAGRun)
        # dag_run id is generated randomly to allow multiple executions of the script
        dag_run = DAGRun(
            dag_run_id='some_test_run_' + uuid.uuid4().hex,
        )
        api_response = dag_run_api_instance.post_dag_run(DAG_ID, dag_run)
        print(api_response)
    except airflow_client.client.exceptions.OpenApiException as e:
        print("[red]Exception when calling DAGRunAPI->post_dag_run: %s\n" % e)
        errors = True
    else:
        print('[green]Posting DAG Run successful')

    # Get the current configuration. Note that this is disabled by default in most installations.
    # You need to set `expose_config = True` in the Airflow configuration in order to retrieve it.
    conf_api_instance = config_api.ConfigApi(api_client)
    try:
        api_response = conf_api_instance.get_config()
        print(api_response)
    except airflow_client.client.OpenApiException as e:
        print("[red]Exception when calling ConfigApi->get_config: %s\n" % e)
        errors = True
    else:
        print('[green]Config retrieved successfully')

    if errors:
        print('\n[red]There were errors while running the script - see above for details')
    else:
        print('\n[green]Everything went well')
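
After triggering a DAG run you will usually want to wait for it to finish. Below is a minimal sketch, assuming the stable API's get_dag_run(dag_id, dag_run_id) operation on DAGRunApi; it reuses dag_run_api_instance, DAG_ID and dag_run from the script above, so run it inside the same `with` block, right after the run is posted:

import time

# Poll the triggered run until it reaches a terminal state (sketch; the dag_run_id and
# state fields follow the stable REST API's DAGRun schema).
terminal_states = {"success", "failed"}
for _ in range(60):  # give up after roughly five minutes
    run = dag_run_api_instance.get_dag_run(DAG_ID, dag_run.dag_run_id)
    state = getattr(run.state, "value", run.state)  # state may be a model enum or a plain string
    print(f"DAG run {run.dag_run_id} is in state {state}")
    if state in terminal_states:
        break
    time.sleep(5)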

See the README for full client API documentation.
