
Apache Airflow API (Stable)

Project description

Apache Airflow Python Client

Requirements

Python >= 3.7

Installation & Usage

pip install

You can install directly using pip:

pip install apache-airflow-client

Setuptools

Or install from source via Setuptools:

git clone git@github.com:apache/airflow-client-python.git
cd airflow-client-python
python setup.py install --user

(or `sudo python setup.py install` to install the package for all users)

Then import the package:

import airflow_client.client
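
As a quick sanity check of the installation, you can print the client version (a minimal sketch; it assumes the generated package exposes a __version__ attribute, as openapi-generator based clients typically do):

import airflow_client.client

# Print the installed client version (assumes __version__ is defined by the package)
print(airflow_client.client.__version__)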

Changelog

See CHANGELOG.md to keep track of what has changed in the client.

Getting Started

Please follow the installation procedure and then run the following example Python script:

import uuid

import airflow_client.client
try:
    # If you have rich installed, you will have nice colored output of the API responses
    from rich import print
except ImportError:
    print("Output will not be colored. Please install rich to get colored output: `pip install rich`")
from airflow_client.client.api import config_api, dag_api, dag_run_api
from airflow_client.client.model.dag_run import DAGRun

# The client must use authentication and authorization parameters
# in accordance with the API server security policy.
# Examples for each auth method are provided below; use the example that
# satisfies your auth use case.
#
# For the basic authentication below, make sure that Airflow is also
# configured with the basic_auth backend, in addition to the regular session
# backend needed by the UI. In the `[api]` section of your `airflow.cfg` set:
#
# auth_backends = airflow.api.auth.backend.session,airflow.api.auth.backend.basic_auth
#
# Make sure that your username/password are configured properly - the user
# should have admin privileges in Airflow
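#
# As an illustrative alternative (not part of the original example), the same setting can
# usually be supplied through Airflow's standard AIRFLOW__<SECTION>__<KEY> environment
# variable convention, e.g.:
#
# export AIRFLOW__API__AUTH_BACKENDS=airflow.api.auth.backend.session,airflow.api.auth.backend.basic_auth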

# Configure HTTP basic authorization: Basic
configuration = airflow_client.client.Configuration(
    host="http://localhost:8080/api/v1",
    username='admin',
    password='admin'
)
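
# As an illustrative alternative (not part of the original example), the credentials above
# could be read from environment variables instead of being hard-coded, e.g.:
#
# import os
# configuration = airflow_client.client.Configuration(
#     host=os.environ.get("AIRFLOW_HOST", "http://localhost:8080/api/v1"),
#     username=os.environ.get("AIRFLOW_USERNAME", "admin"),
#     password=os.environ.get("AIRFLOW_PASSWORD", "admin"),
# )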

# Make sure that in the [core] section of your airflow.cfg the `load_examples` config is set
# to True, or that the AIRFLOW__CORE__LOAD_EXAMPLES environment variable is set to True
DAG_ID = "example_bash_operator"

# Enter a context with an instance of the API client
with airflow_client.client.ApiClient(configuration) as api_client:

    errors = False

    print('[blue]Getting DAG list')
    dag_api_instance = dag_api.DAGApi(api_client)
    try:
        api_response = dag_api_instance.get_dags()
        print(api_response)
    except airflow_client.client.OpenApiException as e:
        print("[red]Exception when calling DagAPI->get_dags: %s\n" % e)
        errors = True
    else:
        print('[green]Getting DAG list successful')


    print('[blue]Getting Tasks for a DAG')
    try:
        api_response = dag_api_instance.get_tasks(DAG_ID)
        print(api_response)
    except airflow_client.client.OpenApiException as e:
        print("[red]Exception when calling DagAPI->get_tasks: %s\n" % e)
        errors = True
    else:
        print('[green]Getting Tasks successful')


    print('[blue]Triggering a DAG run')
    dag_run_api_instance = dag_run_api.DAGRunApi(api_client)
    try:
        # Create a DAGRun object (no dag_id should be specified because it is a read-only property of DAGRun)
        # The dag_run_id is generated randomly to allow multiple executions of the script
        dag_run = DAGRun(
            dag_run_id='some_test_run_' + uuid.uuid4().hex,
        )
        api_response = dag_run_api_instance.post_dag_run(DAG_ID, dag_run)
        print(api_response)
    except airflow_client.client.OpenApiException as e:
        print("[red]Exception when calling DAGRunAPI->post_dag_run: %s\n" % e)
        errors = True
    else:
        print('[green]Posting DAG Run successful')

    # Get the current configuration. Note that this is disabled by default in most installations.
    # You need to set `expose_config = True` in the Airflow configuration in order to retrieve it.
    conf_api_instance = config_api.ConfigApi(api_client)
    try:
        api_response = conf_api_instance.get_config()
        print(api_response)
    except airflow_client.client.OpenApiException as e:
        print("[red]Exception when calling ConfigApi->get_config: %s\n" % e)
        errors = True
    else:
        print('[green]Config retrieved successfully')

    if errors:
        print('\n[red]There were errors while running the script - see above for details')
    else:
        print('\n[green]Everything went well')
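
Once the DAG run has been triggered, you can check on it through the same API. The snippet below is a minimal sketch that is not part of the original example: it assumes the variables from the script above (configuration, DAG_ID, dag_run) are still in scope, and uses the stable API's GET /dags/{dag_id}/dagRuns/{dag_run_id} endpoint exposed by DAGRunApi as get_dag_run:

import time

with airflow_client.client.ApiClient(configuration) as api_client:
    dag_run_api_instance = dag_run_api.DAGRunApi(api_client)
    # Fetch the run triggered above a few times; the `state` field in the printed
    # response shows it progressing from queued/running to success or failed
    for _ in range(3):
        monitored_run = dag_run_api_instance.get_dag_run(DAG_ID, dag_run.dag_run_id)
        print(monitored_run)
        time.sleep(10)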

See README for full client API documentation.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

apache-airflow-client-2.6.0rc3.tar.gz (200.9 kB)


Built Distribution

apache_airflow_client-2.6.0rc3-py3-none-any.whl

File details

Details for the file apache-airflow-client-2.6.0rc3.tar.gz.

File metadata

File hashes

Hashes for apache-airflow-client-2.6.0rc3.tar.gz
SHA256:      8193205be5f49a7c123c9eb24634a94f0b48c005b5c2a8a214ea606a49a1756e
MD5:         cc435e86b74ab7a2f0f349ad5b9fe1d4
BLAKE2b-256: 9ee96fec2b8148f8d1cf3dab7657fdc9796e53152ff9c792cb0cec1061d284c2

See more details on using hashes here.
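
If you download the source distribution manually, you can verify it against the SHA256 digest above before installing. A minimal sketch using only the Python standard library (it assumes the archive has been downloaded to the current directory):

import hashlib

expected_sha256 = "8193205be5f49a7c123c9eb24634a94f0b48c005b5c2a8a214ea606a49a1756e"

# Compute the SHA256 digest of the downloaded archive and compare it with the published one
with open("apache-airflow-client-2.6.0rc3.tar.gz", "rb") as f:
    actual_sha256 = hashlib.sha256(f.read()).hexdigest()

print("OK" if actual_sha256 == expected_sha256 else "Hash mismatch!")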

File details

Details for the file apache_airflow_client-2.6.0rc3-py3-none-any.whl.

File metadata

File hashes

Hashes for apache_airflow_client-2.6.0rc3-py3-none-any.whl
SHA256:      ddea96a486f9e0794188b179061dfb4aa334b0c2c4ae94af972d3146fac277de
MD5:         f1ea11dec17eddea918a4b8879b36b7b
BLAKE2b-256: 108f29247f06c0cd032e74a7a2fb55e9640ff551a4cd8b2cec4752ae34aa76dc

See more details on using hashes here.
