DataHub Airflow plugin to capture executions and send them to DataHub

Project description

Datahub Airflow Plugin

Capabilities

DataHub supports integration of

  • Airflow Pipeline (DAG) metadata
  • DAG and Task run information
  • Lineage information when present

Installation

  1. Install the plugin package in your Airflow environment:
  pip install acryl-datahub-airflow-plugin

Note: We recommend using the lineage plugin if you are on Airflow version >= 2.0.2, or on MWAA with an Airflow version >= 2.0.2.

  2. Disable lazy plugin loading in your airflow.cfg:
  core.lazy_load_plugins : False
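
    In airflow.cfg this corresponds to the lazy_load_plugins key under the [core] section; it can also be set through Airflow's standard environment-variable override. A minimal sketch:

    # airflow.cfg
    [core]
    lazy_load_plugins = False

    # or, equivalently, as an environment variable
    export AIRFLOW__CORE__LAZY_LOAD_PLUGINS=False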
  3. Configure an Airflow hook for DataHub. Both a DataHub REST hook and a Kafka-based hook are supported, but you only need one.

    # For REST-based:
    airflow connections add  --conn-type 'datahub_rest' 'datahub_rest_default' --conn-host 'http://localhost:8080'
    # For Kafka-based (standard Kafka sink config can be passed via extras):
    airflow connections add  --conn-type 'datahub_kafka' 'datahub_kafka_default' --conn-host 'broker:9092' --conn-extra '{}'
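
    For the Kafka hook, sink options can be passed through --conn-extra as JSON. A hedged sketch: the {"connection": {...}} shape mirrors the DataHub Kafka sink config and is an assumption here, and the schema registry URL is a placeholder value.

    # Illustrative only: supplying a schema registry URL via the connection extras.
    airflow connections add --conn-type 'datahub_kafka' 'datahub_kafka_default' \
        --conn-host 'broker:9092' \
        --conn-extra '{"connection": {"schema_registry_url": "http://localhost:8081"}}'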
    
  4. Add your datahub_conn_id and/or cluster to your airflow.cfg file if they do not align with the default values. See the configuration parameters below.

    Configuration options:

    Name                            Default value         Description
    datahub.datahub_conn_id         datahub_rest_default  The name of the DataHub connection you set in step 3.
    datahub.cluster                 prod                  Name of the Airflow cluster.
    datahub.capture_ownership_info  true                  If true, the owners field of the DAG is captured as a DataHub corpuser.
    datahub.capture_tags_info       true                  If true, the tags field of the DAG is captured as DataHub tags.
    datahub.graceful_exceptions     true                  If true, most runtime errors in the lineage backend are suppressed and will not cause the task to fail. Configuration issues still raise exceptions.
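
    Putting these together, a sample [datahub] section in airflow.cfg might look like this (values are the defaults from the table above; mapping the dotted names onto a [datahub] section is an assumption about the config-file layout):

    [datahub]
    datahub_conn_id = datahub_rest_default
    cluster = prod
    capture_ownership_info = true
    capture_tags_info = true
    graceful_exceptions = true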
  5. Configure inlets and outlets for your Airflow operators. For reference, see the sample DAG in lineage_backend_demo.py, or lineage_backend_taskflow_demo.py if you're using the TaskFlow API. A minimal sketch follows.
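
    A minimal sketch of inlets/outlets on an operator, modeled on lineage_backend_demo.py (the datahub_provider.entities import path and the platform/table names are illustrative assumptions):

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator
    from datahub_provider.entities import Dataset  # import path assumed for this version

    with DAG(
        dag_id="datahub_lineage_sketch",
        start_date=datetime(2022, 1, 1),
        schedule_interval=None,
    ) as dag:
        # The plugin reads inlets/outlets from the operator and emits them
        # to DataHub as upstream/downstream dataset lineage.
        transform = BashOperator(
            task_id="transform",
            bash_command="echo transform",
            inlets=[Dataset("snowflake", "mydb.schema.tableA")],
            outlets=[Dataset("snowflake", "mydb.schema.tableB")],
        )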

  6. [optional] Learn more about Airflow lineage, including shorthand notation and some automation.

How to validate installation

  1. In the Airflow UI, go to Admin -> Plugins and check that the DataHub plugin is listed.
  2. Run an Airflow DAG. In the task logs, you should see DataHub-related messages like:
Emitting Datahub ...
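
You can also confirm the plugin is registered from the command line; Airflow 2.x provides a plugins subcommand that lists loaded plugins:

airflow plugins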

Additional references

Related DataHub videos:

  • Airflow Lineage
  • Airflow Run History in DataHub


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

acryl-datahub-airflow-plugin-0.8.44.2.tar.gz (7.8 kB)

Built Distribution

acryl_datahub_airflow_plugin-0.8.44.2-py3-none-any.whl

File details

Details for the file acryl-datahub-airflow-plugin-0.8.44.2.tar.gz.

File hashes

Hashes for acryl-datahub-airflow-plugin-0.8.44.2.tar.gz:

Algorithm    Hash digest
SHA256       e06f875c1e1dfb13072a0fb2f02410491926a2f5796c00f00fe9dd7d23b3c460
MD5          e68fbd9cd3b1d0e9f4e94ce90e3b84f8
BLAKE2b-256  e249deefe1baa4ff00772991366b81ec157cc3a646bf35954994f543ee6c7fc9

See more details on using hashes here.
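
For example, to check the source distribution against the SHA256 digest above (standard sha256sum usage; pip can also enforce digests with --require-hashes):

sha256sum acryl-datahub-airflow-plugin-0.8.44.2.tar.gz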

File details

Details for the file acryl_datahub_airflow_plugin-0.8.44.2-py3-none-any.whl.

File hashes

Hashes for acryl_datahub_airflow_plugin-0.8.44.2-py3-none-any.whl:

Algorithm    Hash digest
SHA256       4c8321f687f0a9f8ab2ca56a7392b8401794ea4f5f1d609a6cf350458e9d592f
MD5          6bc89a4566feda0f5977c1f568da1a7e
BLAKE2b-256  8e69a0cca23034a46f8e5ce7cf31176223f36909b22a729dd7bf158e63868f0d

See more details on using hashes here.
