
OpenLineage Airflow Integration

A library that integrates Airflow DAGs with OpenLineage for automatic metadata collection.

Features

Metadata

  • Task lifecycle
  • Task parameters
  • Task runs linked to versioned code
  • Task inputs / outputs

Lineage

  • Track inter-DAG dependencies

Built-in

  • SQL parser
  • Link to code builder (ex: GitHub)
  • Metadata extractors

Requirements

  • Python 3.7+
  • Apache Airflow 2.1+ (see the Setup section below for version-specific notes)

Installation

$ pip3 install openlineage-airflow

Note: You can also add openlineage-airflow to your requirements.txt for Airflow.

To install from source, run:

$ python3 setup.py install

Setup

Airflow 2.3+

The integration automatically registers itself for Airflow 2.3 and later if it's installed in the Airflow worker's Python environment. This means you don't have to do anything besides configuring it, which is described in the Configuration section.

Airflow 2.1 - 2.2

This method has limited support: it does not support tracking failed jobs, and job starts are registered only when a job ends.

Set your LineageBackend in your airflow.cfg or via the environment variable AIRFLOW__LINEAGE__BACKEND to

openlineage.lineage_backend.OpenLineageBackend
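
For example, the equivalent entry in airflow.cfg would be:

[lineage]
backend = openlineage.lineage_backend.OpenLineageBackend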

In contrast to integration via subclassing a DAG, a LineageBackend-based approach collects all metadata for a task on each task's completion.

The OpenLineageBackend does not take into account manually configured inlets and outlets.

When enabled, the library will:

  1. On DAG start, collect metadata for each task using an Extractor if it exists for a given operator.
  2. Collect task input / output metadata (source, schema, etc.)
  3. Collect task run-level metadata (execution time, state, parameters, etc.)
  4. On DAG complete, also mark the task as complete in OpenLineage

Configuration

HTTP Backend Environment Variables

openlineage-airflow uses the OpenLineage client to push data to an OpenLineage backend.

The OpenLineage client depends on environment variables:

  • OPENLINEAGE_URL - point to the service that will consume OpenLineage events.
  • OPENLINEAGE_API_KEY - set if the consumer of OpenLineage events requires a Bearer authentication key.
  • OPENLINEAGE_NAMESPACE - set if you are using something other than the default namespace for the job namespace.
  • OPENLINEAGE_AIRFLOW_DISABLE_SOURCE_CODE - set to True if you do not want the source code of callables provided in the PythonOperator to be included in OpenLineage events.

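For example, to send events to a locally hosted backend under a custom namespace (the URL and namespace values below are illustrative):

OPENLINEAGE_URL=http://localhost:5000
OPENLINEAGE_NAMESPACE=my_namespace
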
For backwards compatibility, openlineage-airflow also supports configuration via MARQUEZ_URL, MARQUEZ_NAMESPACE and MARQUEZ_API_KEY variables.

MARQUEZ_URL=http://my_hosted_marquez.example.com:5000
MARQUEZ_NAMESPACE=my_special_ns

Extractors: Sending the correct data from your DAGs

If you do nothing, the OpenLineage backend will receive the Job and the Run from your DAGs, but, unless you use one of the few operators for which this integration provides an extractor, input and output metadata will not be sent.

openlineage-airflow allows you to do more than that by building "Extractors." An extractor is an object suited to extracting metadata from a particular operator (or operators). For a given task, an extractor produces:

  1. Name : The name of the task
  2. Inputs : A list of input datasets
  3. Outputs : A list of output datasets
  4. Context : The Airflow context for the task

Bundled Extractors

openlineage-airflow provides extractors for:

  • PostgresOperator
  • MySqlOperator
  • AthenaOperator
  • BigQueryOperator
  • SnowflakeOperator
  • TrinoOperator
  • GreatExpectationsOperator
  • SFTPOperator
  • FTPFileTransmitOperator
  • PythonOperator
  • RedshiftDataOperator, RedshiftSQLOperator
  • SageMakerProcessingOperator, SageMakerProcessingOperatorAsync
  • SageMakerTrainingOperator, SageMakerTrainingOperatorAsync
  • SageMakerTransformOperator, SageMakerTransformOperatorAsync
  • S3CopyObjectExtractor, S3FileTransformExtractor
  • GCSToGCSOperator
  • DbtCloudRunJobOperator

SQL operators utilize the SQL parser. An experimental SQL parser is activated if you install openlineage-sql on your Airflow worker.
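
Assuming the parser package is published under that name, it can be installed with:

$ pip3 install openlineage-sql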

Custom Extractors

If your DAGs contain additional operators from which you want to extract lineage data, fear not - you can always provide custom extractors. They should derive from BaseExtractor.

There are two ways to register them for use in openlineage-airflow.

One way is to add them to the OPENLINEAGE_EXTRACTORS environment variable, separated by a semi-colon (;).

OPENLINEAGE_EXTRACTORS=full.path.to.ExtractorClass;full.path.to.AnotherExtractorClass

To ensure OpenLineage logging propagation to custom extractors, use self.log instead of creating a logger yourself.
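
As a hedged sketch (the operator and dataset names are illustrative, and the exact extractor API may vary between versions), a custom extractor might look like this:

from typing import List, Optional

from openlineage.airflow.extractors.base import BaseExtractor, TaskMetadata
from openlineage.client.run import Dataset


class MyOperatorExtractor(BaseExtractor):
    @classmethod
    def get_operator_classnames(cls) -> List[str]:
        # Operators handled by this extractor, matched by class name.
        return ["MyOperator"]

    def extract(self) -> Optional[TaskMetadata]:
        # self.operator is the operator instance being executed;
        # self.log keeps OpenLineage logging propagation working.
        self.log.debug("Extracting lineage from %s", self.operator.task_id)
        return TaskMetadata(
            name=f"{self.operator.dag_id}.{self.operator.task_id}",
            inputs=[Dataset(namespace="postgres://example-host:5432", name="schema.source_table")],
            outputs=[Dataset(namespace="postgres://example-host:5432", name="schema.target_table")],
        )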

Default Extractor

When you own the operator's code, you don't need to provide a custom extractor. Instead, you can use the Default Extractor's capability.

To do so, define at least one of the following two methods in your operator:

  • get_openlineage_facets_on_start()

Extracts metadata at the start of the task.

  • get_openlineage_facets_on_complete(task_instance: TaskInstance)

Extracts metadata when the task completes. It should accept a task_instance argument, similar to the extract_on_complete method in base extractors.

If you don't define the get_openlineage_facets_on_complete method, the integration falls back to get_openlineage_facets_on_start.
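
As a rough sketch of the Default Extractor approach (assuming the methods return a TaskMetadata-like object exposing inputs, outputs and facets; the expected return type may differ between integration versions, and all dataset names are illustrative):

from airflow.models.baseoperator import BaseOperator
from openlineage.airflow.extractors.base import TaskMetadata
from openlineage.client.run import Dataset


class MyOperator(BaseOperator):
    def execute(self, context):
        # ... the operator's actual work goes here ...
        pass

    def get_openlineage_facets_on_start(self):
        # Called when the task starts.
        return TaskMetadata(
            name=f"{self.dag_id}.{self.task_id}",
            inputs=[Dataset(namespace="example", name="source_dataset")],
            outputs=[Dataset(namespace="example", name="target_dataset")],
        )

    def get_openlineage_facets_on_complete(self, task_instance):
        # Called on task completion; if omitted, the integration falls back to the method above.
        return self.get_openlineage_facets_on_start()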

Great Expectations

The Great Expectations integration works by providing an OpenLineageValidationAction. You need to include it in your action_list in great_expectations.yml.

The following example illustrates a way to change the default configuration:

validation_operators:
  action_list_operator:
    # To learn how to configure sending Slack notifications during evaluation
    # (and other customizations), read: https://docs.greatexpectations.io/en/latest/autoapi/great_expectations/validation_operators/index.html#great_expectations.validation_operators.ActionListValidationOperator
    class_name: ActionListValidationOperator
    action_list:
      - name: store_validation_result
        action:
          class_name: StoreValidationResultAction
      - name: store_evaluation_params
        action:
          class_name: StoreEvaluationParametersAction
      - name: update_data_docs
        action:
          class_name: UpdateDataDocsAction
+     - name: openlineage
+       action:
+         class_name: OpenLineageValidationAction
+         module_name: openlineage.common.provider.great_expectations.action
      # - name: send_slack_notification_on_validation_result
      #   action:
      #     class_name: SlackNotificationAction
      #     # put the actual webhook URL in the uncommitted/config_variables.yml file
      #     slack_webhook: ${validation_notification_slack_webhook}
      #     notify_on: all # possible values: "all", "failure", "success"
      #     renderer:
      #       module_name: great_expectations.render.renderer.slack_renderer
      #       class_name: SlackRenderer

If you're using GreatExpectationsOperator, you need to set validation_operator_name to an operator that includes OpenLineageValidationAction. Setting it in great_expectations.yml files isn't enough - the operator overrides it with the default name if a different one is not provided.
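
For illustration, here is a hedged sketch of such a task (only validation_operator_name comes from this document; the other parameter names depend on your Great Expectations provider version and are illustrative):

ge_validation = GreatExpectationsOperator(
    task_id="ge_validation",
    expectation_suite_name="my_suite",  # illustrative
    batch_kwargs={"table": "my_table", "datasource": "my_datasource"},  # illustrative
    # Must point at a validation operator whose action_list includes OpenLineageValidationAction:
    validation_operator_name="action_list_operator",
    dag=dag,
)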

To see an example of a working configuration, see DAG and Great Expectations configuration in the integration tests.

Triggering Child Jobs

Commonly, Airflow DAGs will trigger processes on remote systems, such as an Apache Spark or Apache Beam job. Those systems may have their own OpenLineage integrations and report their own job runs and dataset inputs/outputs. To propagate the job hierarchy, tasks must send their own run ids so that the downstream process can report the ParentRunFacet with the proper run id.

The lineage_run_id and lineage_parent_id macros exist to inject the run id or the whole parent run information of a given task into the arguments sent to a remote processing job's Airflow operator. The macros require the DAG run_id and the task in order to access the generated run_id for that task. For example, a Spark job can be triggered using the DataProcPySparkOperator with the correct parent run_id using the following configuration:

t1 = DataProcPySparkOperator(
    task_id=job_name,
    # required pyspark configuration
    job_name=job_name,
    dataproc_pyspark_properties={
        'spark.driver.extraJavaOptions':
            f"-javaagent:{jar}={os.environ.get('OPENLINEAGE_URL')}/api/v1/namespaces/{os.getenv('OPENLINEAGE_NAMESPACE', 'default')}/jobs/{job_name}/runs/{{{{macros.OpenLineagePlugin.lineage_run_id(task, task_instance)}}}}?api_key={os.environ.get('OPENLINEAGE_API_KEY')}"
    },
    dag=dag)

Secrets redaction

The integration uses Airflow SecretsMasker to hide secrets from produced metadata events. As not all fields in the metadata should be redacted, RedactMixin is used to pass information about which fields should be ignored by the process.

Typically, you should subclass RedactMixin and use the _skip_redact attribute as a list of names of fields to be skipped.

However, all facets inheriting from BaseFacet should use the _additional_skip_redact attribute as an addition to the regular list (['_producer', '_schemaURL']).
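
As a minimal sketch (the class, fields, and import path are assumptions for illustration):

from openlineage.airflow.utils import RedactMixin


class ConnectionInfo(RedactMixin):
    # Fields listed in _skip_redact are left untouched by the secrets masker.
    _skip_redact = ["host", "port"]

    def __init__(self, host, port, password):
        self.host = host
        self.port = port
        self.password = password  # still subject to redaction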

Development

The Airflow integration depends on openlineage.sql, openlineage.common and openlineage.client.python. To install all dependencies for local development, install those packages independently first, or try the following command:

$ pip install -r dev-requirements.txt

There is also a bash script that can run an arbitrary Airflow image with an OpenLineage integration built from the current branch. Additionally, it mounts OpenLineage Python packages as Docker volumes. This enables you to change your code without the need to constantly rebuild Docker images to run tests. Run it as:

$ AIRFLOW_IMAGE=<airflow_image_with_tag> ./scripts/run-dev-airflow.sh [--help]

Unit tests

To run the entire unit test suite, use the below command:

$ tox

or choose one of the environments, e.g.:

$ tox -e py-airflow214

You can also skip using tox and run pytest on your own dev environment.

Integration tests

The integration tests require the use of docker compose. There are scripts that make it easier to build the images and run the tests:

$ AIRFLOW_IMAGE=<name-of-airflow-image> ./tests/integration/docker/up.sh
$ AIRFLOW_IMAGE=apache/airflow:2.3.1-python3.7 ./tests/integration/docker/up.sh

When using run-dev-airflow.sh, you can add the -i flag or --attach-integration flag to run integration tests in a dev environment. This can be helpful when you need to run arbitrary integration tests during development. For example, the following command run in the integration container...

python -m pytest test_integration.py::test_integration[great_expectations_validation-requests/great_expectations.json]

...runs a single test which you can repeat after changes in code.


SPDX-License-Identifier: Apache-2.0
Copyright 2018-2023 contributors to the OpenLineage project
