Apache Airflow API (Stable)

Apache Airflow Python Client

Requirements

Python >= 3.7

Installation & Usage

pip install

You can install directly using pip:

pip install apache-airflow-client
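
To install a specific release, you can pin the version explicitly; the version string below mirrors the release this page describes and is only an illustration:

pip install apache-airflow-client==2.7.3rc1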

Setuptools

Or install from source via Setuptools:

git clone git@github.com:apache/airflow-client-python.git
cd airflow-client-python
python setup.py install --user

(or sudo python setup.py install to install the package for all users)
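
Alternatively, pip can install the client straight from the Git repository; this relies on standard pip behaviour rather than anything specific to this project:

pip install git+https://github.com/apache/airflow-client-python.git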

Then import the package:

import airflow_client.client
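
As a quick sanity check, you can print the installed client version. This assumes the generated package exposes a `__version__` attribute, which openapi-generator based clients typically do:

import airflow_client.client

# Assumption: the generated package defines __version__; adjust if your client differs.
print(airflow_client.client.__version__)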

Changelog

See CHANGELOG.md to keep track of what has changed in the client.

Getting Started

Please follow the installation procedure above and then run the following example Python script:

import uuid

import airflow_client.client
try:
    # If you have rich installed, you will have nice colored output of the API responses
    from rich import print
except ImportError:
    print("Output will not be colored. Please install rich to get colored output: `pip install rich`")
    pass
from airflow_client.client.api import config_api, dag_api, dag_run_api
from airflow_client.client.model.dag_run import DAGRun

# The client must use the authentication and authorization parameters
# in accordance with the API server security policy.
# Examples for each auth method are provided below, use the example that
# satisfies your auth use case.
#
# When using the basic authentication below, make sure that Airflow is also
# configured with basic_auth as an auth backend, in addition to the regular session
# backend needed by the UI. In the `[api]` section of your `airflow.cfg` set:
#
# auth_backend = airflow.api.auth.backend.session,airflow.api.auth.backend.basic_auth
#
# Make sure that your username/password are configured properly - use credentials
# that have admin privileges in Airflow.

# Configure HTTP basic authorization: Basic
configuration = airflow_client.client.Configuration(
    host="http://localhost:8080/api/v1",
    username='admin',
    password='admin'
)

# Make sure that the `load_examples` config in the [core] section of your airflow.cfg is set to True,
# or that the AIRFLOW__CORE__LOAD_EXAMPLES environment variable is set to True
DAG_ID = "example_bash_operator"

# Enter a context with an instance of the API client
with airflow_client.client.ApiClient(configuration) as api_client:

    errors = False

    print('[blue]Getting DAG list')
    dag_api_instance = dag_api.DAGApi(api_client)
    try:
        api_response = dag_api_instance.get_dags()
        print(api_response)
    except airflow_client.client.OpenApiException as e:
        print("[red]Exception when calling DagAPI->get_dags: %s\n" % e)
        errors = True
    else:
        print('[green]Getting DAG list successful')


    print('[blue]Getting Tasks for a DAG')
    try:
        api_response = dag_api_instance.get_tasks(DAG_ID)
        print(api_response)
    except airflow_client.client.OpenApiException as e:
        print("[red]Exception when calling DagAPI->get_tasks: %s\n" % e)
        errors = True
    else:
        print('[green]Getting Tasks successful')


    print('[blue]Triggering a DAG run')
    dag_run_api_instance = dag_run_api.DAGRunApi(api_client)
    try:
        # Create a DAGRun object (no dag_id should be specified because it is a read-only property of DAGRun)
        # dag_run id is generated randomly to allow multiple executions of the script
        dag_run = DAGRun(
            dag_run_id='some_test_run_' + uuid.uuid4().hex,
        )
        api_response = dag_run_api_instance.post_dag_run(DAG_ID, dag_run)
        print(api_response)
    except airflow_client.client.OpenApiException as e:
        print("[red]Exception when calling DAGRunAPI->post_dag_run: %s\n" % e)
        errors = True
    else:
        print('[green]Posting DAG Run successful')

    # Get current configuration. Note: this is disabled by default in most installations.
    # You need to set `expose_config = True` in Airflow configuration in order to retrieve configuration.
    conf_api_instance = config_api.ConfigApi(api_client)
    try:
        api_response = conf_api_instance.get_config()
        print(api_response)
    except airflow_client.client.OpenApiException as e:
        print("[red]Exception when calling ConfigApi->get_config: %s\n" % e)
        errors = True
    else:
        print('[green]Config retrieved successfully')

    if errors:
        print('\n[red]There were errors while running the script - see above for details')
    else:
        print('\n[green]Everything went well')
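
As an optional follow-up, you can poll the run you just triggered and watch its state. The sketch below assumes it is appended to the end of the script above (so `configuration`, `DAG_ID`, `dag_run` and the imports are still in scope) and that the generated client exposes `DAGRunApi.get_dag_run(dag_id, dag_run_id)` for the GET /dags/{dag_id}/dagRuns/{dag_run_id} endpoint - verify the method name against your client version:

import time

with airflow_client.client.ApiClient(configuration) as api_client:
    dag_run_api_instance = dag_run_api.DAGRunApi(api_client)
    # Poll a few times and print the current state of the DAG run triggered above.
    # NOTE: get_dag_run(dag_id, dag_run_id) is assumed from the stable REST API spec.
    for _ in range(5):
        run = dag_run_api_instance.get_dag_run(DAG_ID, dag_run.dag_run_id)
        print('[blue]Current DAG run state: %s' % run.state)
        time.sleep(10)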

See README for full client API documentation.

Download files

Source Distribution

apache-airflow-client-2.7.3rc1.tar.gz (373.0 kB)

SHA256      6bd07b3abd27cc3465088c3bc59ad33e6bd6b2434e6a6dc717b42c1f0a945a73
MD5         6dc4a73c9114dae981c5e2d4d429625c
BLAKE2b-256 7fb52e438e35eb44b7098b920bafb1e62ddd582324d51c043c96a19d165d733e

Built Distribution

apache_airflow_client-2.7.3rc1-py3-none-any.whl

SHA256      367c39a1ce2cef34d1e0e76750293d600f91af21c62649987eacbfd9c1d6689a
MD5         d48baeb6fb32526cb123e40bea213d46
BLAKE2b-256 75994c1c41b9ac60a3fd3e30b3e192e7d4faacce6df13ff063aa2fa063c813a1
