
Opinionated framework based on Airflow 2.0 for building pipelines to ingest data into a BigQuery data warehouse

Project description

gcp-airflow-foundations



Airflow is an awesome open-source orchestration framework that is the go-to for building data ingestion pipelines on GCP (using Composer, a hosted Airflow service). However, most companies using it face the same set of problems:

  • Learning curve: Airflow requires Python knowledge and has some gotchas that take time to learn. Further, writing Python DAGs for every single table that needs to be ingested becomes cumbersome. Most companies end up building utilities for creating DAGs out of configuration files, to simplify DAG creation and to allow non-developers to configure ingestion.
  • Data lake and data pipeline design best practices: Airflow only provides the building blocks; users are still required to understand and implement the nuances of building proper ingestion pipelines for the data lake/data warehouse platform they are using.
  • Core reusability and best-practice enforcement across the enterprise: usually, each team maintains its own Airflow source code and deployment.

We have written an opinionated yet flexible ingestion framework for building pipelines that ingest data into a BigQuery data warehouse. It supports the following features:

  • Zero-code, config-file-based ingestion - anybody can start ingesting from the growing number of supported sources by providing a simple configuration file. No Python or Airflow knowledge is required.
  • Modular and extendable - the core of the framework is a lightweight library. Ingestion sources are added as plugins, and a new source can be added by extending the provided base classes.
  • Opinionated, automatic creation of an ODS (Operational Data Store) and HDS (Historical Data Store) in BigQuery, while enforcing best practices such as schema migration, data quality validation, idempotency, and partitioning.
  • Dataflow job support for ingesting large datasets from SQL sources and for deploying jobs into a specific network or shared VPC.
  • Support for advanced Airflow features for job prioritization, such as slots and priorities.
  • Integration with GCP data services such as DLP and Data Catalog [work in progress].
  • Well tested - We maintain a rich suite of both unit and integration tests.

Installing from PyPI

pip install 'gcp-airflow-foundations'

Full Documentation

See the gcp-airflow-foundations documentation for more details.

Running locally

Sample DAGs

Sample DAGs that ingest publicly available GCS files can be found in the dags folder; they start running as soon as Airflow is run locally. In order for them to run successfully, please ensure the following:

  • Enable the BigQuery, Cloud Storage, Cloud DLP, and Data Catalog APIs (a command sketch follows this list)
  • Create BigQuery datasets for the HDS and ODS
  • Create a DLP inspect template
  • Create a policy tag in Data Catalog
  • Update the gcp_project, location, and dataset values, as well as the dlp and policy tag configs, with your newly created values
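
The following is a minimal shell sketch of the first two steps, assuming the gcloud and bq CLIs are installed and authenticated; the project ID and dataset names are placeholders and should match the values in your configs (the DLP inspect template and policy tag can be created in the Cloud Console):

  # Enable the required APIs (replace my-test-project with your project)
  gcloud services enable bigquery.googleapis.com storage.googleapis.com dlp.googleapis.com datacatalog.googleapis.com --project=my-test-project

  # Create datasets for the ODS and HDS (dataset names here are assumptions)
  bq mk --dataset my-test-project:ods
  bq mk --dataset my-test-project:hds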

Using Service Account

  • Create a service account in GCP and save its key as helpers/key/keys.json (see the sketch after this list; don't worry, the key file is in .gitignore and will not be pushed to the git repo)
  • Run Airflow locally (Airflow UI will be accessible at http://localhost:8080): docker-compose up
  • Default authentication values for the Airflow UI are provided in lines 96, 97 of docker-composer.yaml
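
A hedged gcloud sketch of the service account step above; the account name and the bigquery.admin role are illustrative assumptions, not requirements of the framework, and my-test-project is a placeholder:

  # Create a service account and grant it access to BigQuery (role is an assumption)
  gcloud iam service-accounts create airflow-local --project=my-test-project
  gcloud projects add-iam-policy-binding my-test-project --member="serviceAccount:airflow-local@my-test-project.iam.gserviceaccount.com" --role="roles/bigquery.admin"

  # Download its key to the path expected by the local setup
  gcloud iam service-accounts keys create helpers/key/keys.json --iam-account=airflow-local@my-test-project.iam.gserviceaccount.com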

Using user IAM

  • Uncomment line 11 in docker-composer.yaml
  • Set the env var PROJECT_ID to your test project (see the sketch after this list)
  • Authorize gcloud to access the Cloud Platform with Google user credentials: helpers/scripts/gcp-auth.sh
  • Run Airflow locally (Airflow UI will be accessible at http://localhost:8080): docker-compose up
  • Default authentication values for the Airflow UI are provided in lines 96, 97 of docker-composer.yaml
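
Put together, a sketch of the user IAM flow (the project ID is a placeholder):

  # Point the containers at your test project, authenticate with your own Google account, then start Airflow
  export PROJECT_ID=my-test-project
  ./helpers/scripts/gcp-auth.sh
  docker-compose up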

Running tests

  • Run unit tests: ./tests/airflow "pytest tests/unit"
  • Run unit tests with a coverage report: ./tests/airflow "pytest --cov=gcp_airflow_foundations tests/unit"
  • Run integration tests: ./tests/airflow "pytest tests/integration"
  • Rebuild the Docker image if requirements changed: docker-compose build

Contributing

Install pre-commit hook

Install pre-commit hooks for linting, format checking, etc.

  • Install the pre-commit Python library locally: pip install pre-commit
  • Install the pre-commit hooks for the repo: pre-commit install (a usage sketch follows this list)
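
Once installed, the hooks run automatically on git commit; to check the whole repository in one pass, they can also be run manually, for example:

  # Run every configured hook against all files, not just staged changes
  pre-commit run --all-files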

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

gcp-airflow-foundations-0.3.7.tar.gz (78.5 kB)


File details

Details for the file gcp-airflow-foundations-0.3.7.tar.gz.

File metadata

File hashes

Hashes for gcp-airflow-foundations-0.3.7.tar.gz
Algorithm Hash digest
SHA256 95d0001a5984bf6485c2737f8dece320c1555756155c3922699f71786efa1c29
MD5 38e0f851a2fbf63b34518a50f826233f
BLAKE2b-256 a5badc75962c18964f4e98fe36178b68d22758c504e35493f7e4a8350e0a18e7

