Apache Airflow API (Stable)

Project description

Apache Airflow Python Client

Requirements

Python >= 3.7

Installation & Usage

pip install

You can install directly using pip:

pip install apache-airflow-client
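
If you need to match a specific release (for example, the 2.6.2rc2 build listed under "Download files" below), the version can be pinned explicitly:

pip install "apache-airflow-client==2.6.2rc2"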

Setuptools

Or install from source via Setuptools:

git clone git@github.com:apache/airflow-client-python.git
cd airflow-client-python
python setup.py install --user

(or sudo python setup.py install to install the package for all users)
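
If you prefer not to invoke setup.py directly, the same checkout can also be installed with pip (a sketch, assuming a reasonably recent pip):

pip install --user .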

Then import the package:

import airflow_client.client

Changelog

See CHANGELOG.md to keep track of what has changed in the client.

Getting Started

Please follow the installation procedure and then run the following example Python script:

import uuid

import airflow_client.client
try:
    # If you have rich installed, you will have nice colored output of the API responses
    from rich import print
except ImportError:
    print("Output will not be colored. Please install rich to get colored output: `pip install rich`")
    pass
from airflow_client.client.api import config_api, dag_api, dag_run_api
from airflow_client.client.model.dag_run import DAGRun

# The client must use the authentication and authorization parameters
# in accordance with the API server security policy.
# Examples for each auth method are provided below; use the example that
# satisfies your auth use case.
#
# For the basic authentication below, make sure that Airflow is also configured
# with the basic_auth backend in addition to the regular session backend needed
# by the UI. In the `[api]` section of your `airflow.cfg` set:
#
# auth_backend = airflow.api.auth.backend.session,airflow.api.auth.backend.basic_auth
#
# Make sure that your username/password are configured properly, using credentials
# that have admin privileges in Airflow (a combined configuration snippet is shown
# after this script).

# Configure HTTP basic authorization: Basic
configuration = airflow_client.client.Configuration(
    host="http://localhost:8080/api/v1",
    username='admin',
    password='admin'
)

# Make sure that in the [core] section of your airflow.cfg the `load_examples` config
# is set to True, or that the AIRFLOW__CORE__LOAD_EXAMPLES environment variable is set to True
DAG_ID = "example_bash_operator"

# Enter a context with an instance of the API client
with airflow_client.client.ApiClient(configuration) as api_client:

    errors = False

    print('[blue]Getting DAG list')
    dag_api_instance = dag_api.DAGApi(api_client)
    try:
        api_response = dag_api_instance.get_dags()
        print(api_response)
    except airflow_client.client.OpenApiException as e:
        print("[red]Exception when calling DagAPI->get_dags: %s\n" % e)
        errors = True
    else:
        print('[green]Getting DAG list successful')


    print('[blue]Getting Tasks for a DAG')
    try:
        api_response = dag_api_instance.get_tasks(DAG_ID)
        print(api_response)
    except airflow_client.client.OpenApiException as e:
        print("[red]Exception when calling DagAPI->get_tasks: %s\n" % e)
        errors = True
    else:
        print('[green]Getting Tasks successful')


    print('[blue]Triggering a DAG run')
    dag_run_api_instance = dag_run_api.DAGRunApi(api_client)
    try:
        # Create a DAGRun object (no dag_id should be specified because it is a
        # read-only property of DAGRun); the dag_run_id is generated randomly to
        # allow multiple executions of the script
        dag_run = DAGRun(
            dag_run_id='some_test_run_' + uuid.uuid4().hex,
        )
        api_response = dag_run_api_instance.post_dag_run(DAG_ID, dag_run)
        print(api_response)
    except airflow_client.client.OpenApiException as e:
        print("[red]Exception when calling DAGRunAPI->post_dag_run: %s\n" % e)
        errors = True
    else:
        print('[green]Posting DAG Run successful')

    # Get current configuration. Note that this is disabled by default in most installations.
    # You need to set `expose_config = True` in the Airflow configuration in order to retrieve it.
    conf_api_instance = config_api.ConfigApi(api_client)
    try:
        api_response = conf_api_instance.get_config()
        print(api_response)
    except airflow_client.client.OpenApiException as e:
        print("[red]Exception when calling ConfigApi->get_config: %s\n" % e)
        errors = True
    else:
        print('[green]Config retrieved successfully')

    if errors:
        print('\n[red]There were errors while running the script - see above for details')
    else:
        print('\n[green]Everything went well')
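
Putting together the configuration hints from the comments in the script above, a minimal airflow.cfg sketch for running this example might look as follows (note: on newer Airflow versions the [api] option is spelled auth_backends rather than auth_backend, and expose_config lives in the [webserver] section):

[api]
auth_backend = airflow.api.auth.backend.session,airflow.api.auth.backend.basic_auth

[core]
load_examples = True

[webserver]
expose_config = True

The same options can also be set through environment variables such as AIRFLOW__CORE__LOAD_EXAMPLES=True, as mentioned in the comments above.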

See README for full client API documentation.
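
As a follow-up to the example above, the snippet below is a minimal sketch for checking the state of a DAG run triggered with post_dag_run. It reuses the basic-auth configuration from the script; RUN_ID is a hypothetical placeholder for the dag_run_id returned by the trigger call.

import airflow_client.client
from airflow_client.client.api import dag_run_api

# Same basic-auth configuration as in the example script above
configuration = airflow_client.client.Configuration(
    host="http://localhost:8080/api/v1",
    username="admin",
    password="admin",
)

DAG_ID = "example_bash_operator"
# RUN_ID is a hypothetical placeholder - replace it with the dag_run_id returned by post_dag_run
RUN_ID = "some_test_run_<hex>"

with airflow_client.client.ApiClient(configuration) as api_client:
    dag_run_api_instance = dag_run_api.DAGRunApi(api_client)
    run = dag_run_api_instance.get_dag_run(DAG_ID, RUN_ID)
    # The state field reports whether the run is queued, running, success or failed
    print(f"DAG run {run.dag_run_id} is in state: {run.state.value}")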

Download files

Download the file for your platform.

Source Distribution

apache-airflow-client-2.6.2rc2.tar.gz (201.9 kB, uploaded as Source)

Built Distribution

apache_airflow_client-2.6.2rc2-py3-none-any.whl (1.4 MB, uploaded as Python 3)

File details

Details for the file apache-airflow-client-2.6.2rc2.tar.gz.

Hashes for apache-airflow-client-2.6.2rc2.tar.gz:

SHA256: 5faf59c62df02e905c42a2633f64563d3ed8a8217b6749337264617b0ba08edf
MD5: 07772e015ddf5cafb8355940538a4fb7
BLAKE2b-256: edaee12396c2f50db29f17d497c8ca0d7c1c03f3069ddc68fb089d209233f216
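
To verify a downloaded source distribution against the SHA256 value above, one possible approach (assuming pip and sha256sum are available) is:

pip download apache-airflow-client==2.6.2rc2 --no-deps --no-binary :all:
sha256sum apache-airflow-client-2.6.2rc2.tar.gz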

File details

Details for the file apache_airflow_client-2.6.2rc2-py3-none-any.whl.

Hashes for apache_airflow_client-2.6.2rc2-py3-none-any.whl:

SHA256: 818fdd375e68e734c4f2737f0c4c7480e1800272f7400419d40e19cc5b43cd24
MD5: 84dd1c521afaae14c99dfdd6e035e9c2
BLAKE2b-256: 30c4e8f435fd725d5643464a9d0e2d58eae7c015f712250d587d3b02e6c3a1a0
