
FastETL: a custom Apache Airflow provider package.


FastETL's logo: a Swiss army knife with some open tools.

FastETL framework, modern, versatile, does almost everything.

This text is also available in Portuguese: 🇧🇷LEIAME.md.



FastETL is a package of Airflow plugins for building data pipelines for a number of common scenarios.

Main features:

  • Full or incremental replication of tables in SQL Server, Postgres and MySQL databases
  • Loading data from GSheets and from spreadsheets on Samba/Windows networks
  • Extracting CSV from SQL
  • Cleaning data using custom data patching tasks (e.g. for messy geographical coordinates, mapping canonical values for columns, etc.)
  • Using an Open Source Routing Machine (OSRM) service to calculate route distances
  • Using CKAN or dados.gov.br's API to update dataset metadata
  • Using Frictionless Tabular Data Packages to write data dictionaries in OpenDocument Text format

This framework is maintained by a network of developers from many teams at the Ministry of Management and Innovation in Public Services, and is the cumulative result of using Apache Airflow, a free and open-source tool, since 2019.

For government: FastETL is widely used for replication of data queried via Quartzo (DaaS) from Serpro.

Installation in Airflow

FastETL implements the standards for Airflow plugins. To install it, simply add the apache-airflow-providers-fastetl package to your Python dependencies in your Airflow environment.

Or install it with

pip install apache-airflow-providers-fastetl
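
After installing, you can check that Airflow has picked up the provider with the Airflow CLI, for example:

airflow providers list | grep fastetl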

To see an example of an Apache Airflow container that uses FastETL, check out the airflow2-docker repository.

To ensure everything works as expected, make sure the msodbcsql17 and unixodbc-dev system libraries are installed on your Apache Airflow workers.
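
As a rough sketch, assuming a Debian-based Airflow image (adjust the Debian release in the second URL to match your base image), these system dependencies can be installed at image build time with something like:

apt-get update && apt-get install -y curl gnupg unixodbc-dev
curl -fsSL https://packages.microsoft.com/keys/microsoft.asc | apt-key add -
curl -fsSL https://packages.microsoft.com/config/debian/11/prod.list > /etc/apt/sources.list.d/mssql-release.list
apt-get update && ACCEPT_EULA=Y apt-get install -y msodbcsql17

The ACCEPT_EULA=Y variable is required because the Microsoft ODBC driver is distributed under its own license terms.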

Tests

The test suite uses Docker containers to simulate a complete usage environment, including Airflow and the databases. For that reason, you first need to install Docker and docker-compose to run the tests.

For instructions on how to do this, see the official Docker documentation.

To build the containers:

make setup

To run the tests, use:

make setup && make tests

To shutdown the environment, use:

make down

Usage examples

The main FastETL feature is the DbToDbOperator, which copies data between Postgres and SQL Server databases. MySQL is also supported as a source.

Here is an example:

from datetime import datetime
from airflow import DAG
from fastetl.operators.db_to_db_operator import DbToDbOperator

default_args = {
    "start_date": datetime(2023, 4, 1),
}

dag = DAG(
    "copy_db_to_db_example",
    default_args=default_args,
    schedule_interval=None,
)


# Placeholder values: replace with your own Airflow connection ids,
# schema names and table name
airflow_source_conn_id = "my_source_conn"
airflow_dest_conn_id = "my_dest_conn"
source_schema = "public"
dest_schema = "public"
table_name = "my_table"

t0 = DbToDbOperator(
    task_id="copy_data",
    source={
        "conn_id": airflow_source_conn_id,
        "schema": source_schema,
        "table": table_name,
    },
    destination={
        "conn_id": airflow_dest_conn_id,
        "schema": dest_schema,
        "table": table_name,
    },
    destination_truncate=True,
    copy_table_comments=True,
    chunksize=10000,
    dag=dag,
)
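
A note on the parameters above, inferred from their names and typical usage: destination_truncate=True empties the destination table before loading, copy_table_comments=True carries table and column comments over to the destination, and chunksize sets how many rows are transferred per batch.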

More detail about the parameters and the workings of DbToDbOperator can be seen in the following files:

How to contribute

To be written in the CONTRIBUTING.md document (issue #4).
