Apache Airflow Provider for Great Expectations
This is an experimental library as of March 2021! The Great Expectations core team maintains this provider in an experimental state and does not guarantee ongoing support yet.
An Airflow operator for Great Expectations, a Python library for testing and validating data.
Notes on compatibility
- This operator has been updated to use Great Expectations Checkpoints instead of the former ValidationOperators. Therefore, it requires Great Expectations >=v0.13.9, which is pinned in the requirements.txt starting with release 0.0.5.
- Great Expectations version 0.13.8 unfortunately contained a bug that would make this operator not work.
- Great Expectations version 0.13.7 and below will work with version 0.0.4 of this operator and below.
This package has been most recently tested with Airflow 2.0 and Great Expectations v0.13.7.
Pre-requisites: an environment running `apache-airflow`; the requirements of this package will be installed as dependencies.

To install the provider:

```
pip install airflow-provider-great-expectations
```
In order to run the `BigQueryOperator`, you will also need to install the relevant dependencies.
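The exact packages depend on your environment; as one hedged example, assuming you use the official Google provider package and the BigQuery SQLAlchemy dialect, the install might look like:

```shell
# Illustrative only: these package choices are assumptions, not pinned requirements
pip install apache-airflow-providers-google pybigquery
```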
Depending on your use case, you might need to add `ENV AIRFLOW__CORE__ENABLE_XCOM_PICKLING=true` to your Dockerfile to enable XCom to pass data between tasks.
Great Expectations Operator: A base operator for Great Expectations. Import into your DAG via:

```python
from great_expectations_provider.operators.great_expectations import GreatExpectationsOperator
```
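As a minimal sketch of how the base operator might be wired into a DAG (the checkpoint name and project root below are placeholders, and the parameter names should be verified against the operator's docstring):

```python
from datetime import datetime

from airflow import DAG
from great_expectations_provider.operators.great_expectations import GreatExpectationsOperator

# Placeholder: point this at your Great Expectations project root
GE_ROOT_DIR = "/usr/local/airflow/include/great_expectations"

with DAG(
    dag_id="example_great_expectations_dag",
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
) as dag:
    # Runs the named Checkpoint; by default the task fails if validation fails
    validate_data = GreatExpectationsOperator(
        task_id="validate_data",
        data_context_root_dir=GE_ROOT_DIR,
        checkpoint_name="my_checkpoint",  # placeholder Checkpoint name
    )
```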
Great Expectations BigQuery Operator: An operator for Great Expectations that provides some pre-set parameters for a BigQuery Datasource and Expectation, Validation, and Data Docs stores in Google Cloud Storage. The operator can also be configured to send email on validation failure. See the docstrings in the class for more configuration options. Import into your DAG via:
```python
from great_expectations_provider.operators.great_expectations_bigquery import GreatExpectationsBigQueryOperator
```
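A rough, hypothetical sketch of a task using the BigQuery variant follows; every keyword name below (`gcp_project`, `gcs_bucket`, `bq_dataset_name`, `expectation_suite_name`) is an assumption for illustration only, so consult the class docstrings for the actual configuration options:

```python
from great_expectations_provider.operators.great_expectations_bigquery import GreatExpectationsBigQueryOperator

# All values are placeholders; the keyword names are assumptions, not the confirmed API
validate_bq_table = GreatExpectationsBigQueryOperator(
    task_id="validate_bq_table",
    gcp_project="my-gcp-project",          # GCP project holding the data
    gcs_bucket="my-ge-bucket",             # GCS bucket for Expectation/Validation/Data Docs stores
    bq_dataset_name="my_dataset",          # BigQuery dataset to validate against
    expectation_suite_name="my_suite",     # Expectation Suite to run
)
```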
See the example_dags directory for an example DAG with some sample tasks that demonstrate operator functionality. The example DAG file contains a comment with instructions on how to run the examples.
Note that to make these operators work, you will need to set `enable_xcom_pickling` to `true` in your airflow.cfg.
These examples can be tested in one of two ways:
With the open-source Astro CLI:
- Initialize a project with the Astro CLI
- Copy the example DAG into the `dags/` folder of your Astro project
- Add the following env var to your `Dockerfile` to enable XCom pickling: `ENV AIRFLOW__CORE__ENABLE_XCOM_PICKLING=true`
- Copy the directories in the `include` folder of this repository into the `include` directory of your Astro project
- Run `astro dev start` to view the DAG on a local Airflow instance (you will need Docker running)
With a vanilla Airflow installation:
- Add the example DAG to your `dags/` folder
- Make the `include/` directory available in your environment
- Change the `ge_root_dir` paths in your DAG file to point to the appropriate places
- Change the paths in `great-expectations/checkpoints/*.yml` to point to the absolute path of your data files
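For orientation, a legacy-style Great Expectations checkpoint file stores its batch paths roughly as in the sketch below; the exact structure varies by Great Expectations version, and the datasource, suite, and file names here are placeholders:

```yaml
# great-expectations/checkpoints/my_checkpoint.yml (illustrative structure only)
validation_operator_name: action_list_operator
batches:
  - batch_kwargs:
      path: /absolute/path/to/your/data.csv   # update to your data file
      datasource: my_datasource               # placeholder datasource name
    expectation_suite_names:
      - my_expectation_suite                  # placeholder suite name
```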
This operator is in early stages of development! Feel free to submit issues, PRs, or join the #integration-airflow channel in the Great Expectations Slack for feedback. Thanks to Pete DeJoy and the Astronomer.io team for the support.