Apache Airflow API (Stable)
Project description
Apache Airflow Python Client
Requirements.
Python >= 3.7
Installation & Usage
pip install
You can install directly using pip:
pip install apache-airflow-client
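If you need the client that matches a specific Airflow release (for example the 2.7.3 release described on this page), you can pin the version explicitly:
pip install "apache-airflow-client==2.7.3"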
Setuptools
Or install via Setuptools.
git clone git@github.com:apache/airflow-client-python.git
cd airflow-client-python
python setup.py install --user
(or sudo python setup.py install to install the package for all users)
Then import the package:
import airflow_client.client
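As a quick, optional sanity check (not part of the upstream docs), you can confirm that the package imports and see which client version pip installed:

from importlib.metadata import version  # Python 3.8+; on 3.7 use the importlib_metadata backport

import airflow_client.client  # verifies the package imports cleanly

# Print the installed client version from the packaging metadata
print(version("apache-airflow-client"))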
Changelog
See CHANGELOG.md to keep track of what has changed in the client.
Getting Started
Please follow the installation procedure and then run the following example Python script:
import uuid
import airflow_client.client
try:
    # If you have rich installed, you will have nice colored output of the API responses
    from rich import print
except ImportError:
    print("Output will not be colored. Please install rich to get colored output: `pip install rich`")
    pass
from airflow_client.client.api import config_api, dag_api, dag_run_api
from airflow_client.client.model.dag_run import DAGRun
# The client must use the authentication and authorization parameters
# in accordance with the API server security policy.
# Examples for each auth method are provided below, use the example that
# satisfies your auth use case.
#
# In case of the basic authentication below, make sure that Airflow is
# configured with basic_auth as an additional backend, on top of the regular session backend
# needed by the UI. In the `[api]` section of your `airflow.cfg` set:
#
# auth_backend = airflow.api.auth.backend.session,airflow.api.auth.backend.basic_auth
#
# Make sure that your username/password are configured properly - use a user/password that has
# admin privileges in Airflow.
# Configure HTTP basic authorization: Basic
configuration = airflow_client.client.Configuration(
    host="http://localhost:8080/api/v1",
    username='admin',
    password='admin'
)
# Make sure that in the [core] section of your airflow.cfg the `load_examples` config is set to True,
# or that the AIRFLOW__CORE__LOAD_EXAMPLES environment variable is set to True
DAG_ID = "example_bash_operator"
# Enter a context with an instance of the API client
with airflow_client.client.ApiClient(configuration) as api_client:
    errors = False

    print('[blue]Getting DAG list')
    dag_api_instance = dag_api.DAGApi(api_client)
    try:
        api_response = dag_api_instance.get_dags()
        print(api_response)
    except airflow_client.client.OpenApiException as e:
        print("[red]Exception when calling DagAPI->get_dags: %s\n" % e)
        errors = True
    else:
        print('[green]Getting DAG list successful')

    print('[blue]Getting Tasks for a DAG')
    try:
        api_response = dag_api_instance.get_tasks(DAG_ID)
        print(api_response)
    except airflow_client.client.exceptions.OpenApiException as e:
        print("[red]Exception when calling DagAPI->get_tasks: %s\n" % e)
        errors = True
    else:
        print('[green]Getting Tasks successful')

    print('[blue]Triggering a DAG run')
    dag_run_api_instance = dag_run_api.DAGRunApi(api_client)
    try:
        # Create a DAGRun object (no dag_id should be specified because it is a read-only property of DAGRun)
        # The dag_run_id is generated randomly to allow multiple executions of the script
        dag_run = DAGRun(
            dag_run_id='some_test_run_' + uuid.uuid4().hex,
        )
        api_response = dag_run_api_instance.post_dag_run(DAG_ID, dag_run)
        print(api_response)
    except airflow_client.client.exceptions.OpenApiException as e:
        print("[red]Exception when calling DAGRunAPI->post_dag_run: %s\n" % e)
        errors = True
    else:
        print('[green]Posting DAG Run successful')

    # Get the current configuration. Note that this is disabled by default in most installations.
    # You need to set `expose_config = True` in the Airflow configuration in order to retrieve it.
    conf_api_instance = config_api.ConfigApi(api_client)
    try:
        api_response = conf_api_instance.get_config()
        print(api_response)
    except airflow_client.client.OpenApiException as e:
        print("[red]Exception when calling ConfigApi->get_config: %s\n" % e)
        errors = True
    else:
        print('[green]Config retrieved successfully')

    if errors:
        print('\n[red]There were errors while running the script - see above for details')
    else:
        print('\n[green]Everything went well')
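As a minimal, hypothetical follow-up to the example above (not part of the upstream script), you could read back the run that was just triggered and inspect its state; this sketch assumes the configuration, DAG_ID and dag_run names defined above and that the trigger call succeeded:

import time

time.sleep(5)  # give the scheduler a moment to pick up the new run

with airflow_client.client.ApiClient(configuration) as api_client:
    dag_run_api_instance = dag_run_api.DAGRunApi(api_client)
    # Fetch the DAGRun created above by its dag_run_id
    run = dag_run_api_instance.get_dag_run(DAG_ID, dag_run.dag_run_id)
    # The returned DAGRun includes fields such as state, start_date and end_date
    print(run)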
See README for full client API documentation.
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
apache-airflow-client-2.7.3.tar.gz (372.8 kB)
Built Distribution
apache_airflow_client-2.7.3-py3-none-any.whl (2.2 MB)
File details
Details for the file apache-airflow-client-2.7.3.tar.gz.
File metadata
- Download URL: apache-airflow-client-2.7.3.tar.gz
- Upload date:
- Size: 372.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.9.6
File hashes
Algorithm | Hash digest
---|---
SHA256 | 57ecec68803d3bc3a8796e873d82fdbfa04910c9790006c42d354a3b441b6024
MD5 | bafc2abbd5f362d9c2a516d80e80f2e7
BLAKE2b-256 | 6ae97555617d4e827dadb43f075daf0753e055a9d4afa18b821fba74c102d2b3
File details
Details for the file apache_airflow_client-2.7.3-py3-none-any.whl.
File metadata
- Download URL: apache_airflow_client-2.7.3-py3-none-any.whl
- Upload date:
- Size: 2.2 MB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.9.6
File hashes
Algorithm | Hash digest
---|---
SHA256 | 986b1f4b78ff680fc4811ff8be73272f231cbfd40a9390f63f9065880e2a825b
MD5 | 2383db098adb4935f52d673dde271722
BLAKE2b-256 | afbea8631cdf81efe918bdc47b36d58abb32506e51d2e13db505cb15de0105e0