
A convenient add-on for Apache Airflow

Project description

# Aerofoil: Lifting Airflow

Aerofoil is an Apache Airflow plugin. It provides several useful extensions that are missing in Airflow. The plugin is available under the MIT license; the intention is to keep it simple and friendly.

### Features

##### Backfill UI

A convenient way to run a backfill DAG from the UI. In most enterprise setups, DAG developers do not have access to the Airflow servers to run backfill jobs, which makes running a backfill an expensive operation. With the Backfill UI, Airflow users can submit a backfill command from the UI. The backfill itself runs as an Airflow DAG and is compatible with most production Airflow configurations. This is also the most used feature in Aerofoil.

##### My DAGs

Large Airflow installations can have hundreds of DAGs. This feature allows the logged-in user to view the DAGs they authored. The DAG author must be set correctly in the DAG file and must match the Airflow username (see the owner sketch below).

##### Reset DAG History

Sometimes a DAG's complete history needs to be reset. There is currently no way of clearing a DAG's history in Airflow. This functionality allows clearing/resetting a DAG's history via a convenient UI.
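Relating to the My DAGs feature above, here is a minimal sketch of setting the DAG author. It assumes Aerofoil matches the author against the standard `owner` field in `default_args`; which attribute the plugin actually reads is not stated in this README, and the DAG id and username below are made up.

```python
# Minimal sketch for My DAGs: assumes the author is taken from the standard
# "owner" default_arg and must equal the Airflow login username.
from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator

with DAG(
    dag_id="daily_sales_load",            # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule=None,                        # Airflow 2.4+; use schedule_interval on older versions
    default_args={"owner": "jane.doe"},   # must match the Airflow username for My DAGs to list it
) as dag:
    EmptyOperator(task_id="start")
```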

##### Fake Success

A shortcut to mark your DAG/task runs successful. Fake Success is a powerful feature, so to avoid misuse an audit trail is maintained. Administrators can always find out who is trying to fake it.

##### AerofoilBashOperator

A context-aware BashOperator that can take a context and pass it to callbacks such as on_success_callback, on_failure_callback, etc.
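A usage sketch for illustration only: the import path and the way extra context reaches the callbacks are assumptions, since the operator's exact signature is not documented in this README; only the standard Airflow callback mechanism shown is certain.

```python
# Illustrative only: the aerofoil import path is an assumption, and
# AerofoilBashOperator is assumed to take the same arguments as Airflow's
# BashOperator while passing a richer context dict to the callbacks.
from datetime import datetime

from airflow import DAG
from aerofoil.operators import AerofoilBashOperator  # assumed import path


def alert_on_failure(context):
    # Standard Airflow callback signature: receives the task context dict.
    ti = context["task_instance"]
    print(f"Task {ti.task_id} in DAG {ti.dag_id} failed")


with DAG("etl_example", start_date=datetime(2024, 1, 1), schedule=None) as dag:
    run_etl = AerofoilBashOperator(
        task_id="run_etl",
        bash_command="python /opt/jobs/run_etl.py",  # hypothetical command
        on_failure_callback=alert_on_failure,
    )
```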

## Installation

### Pre-requisites

- A working Airflow installation.
- Works with any executor other than SequentialExecutor or LocalExecutor (e.g. KubernetesExecutor, CeleryExecutor).
- Requires a non-SQLite database as the metadata backend.

### Install Instructions

- pip install apache-airflow-providers-aerofoil
- Add the required tables to the Airflow metadata DB by executing src/aerofoil/models.sql.
- Deploy the DAG file src/aerofoil/aerofoil_backfill.py to your Airflow (usually by putting it in Airflow's DAGs folder).
- Make sure the __aerofoil_backfill__ DAG is enabled.

## Backfill Design

When a user submits a backfill, it creates an entry in the aerofoil_backfill table. The __aerofoil_backfill__ DAG is scheduled to run every minute (which can be changed by modifying the DAG). The sensor in the DAG picks up the entries in the DB and creates a dynamic task for each entry. The backfill job itself runs as a Bash command in the executor. This provides a flexible and executor-agnostic design.
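For illustration, the sketch below mimics the pattern described above using Airflow 2.x dynamic task mapping; it is not the shipped src/aerofoil/aerofoil_backfill.py. The table columns, the 'queued' state filter, the connection id, and the use of a plain task in place of the plugin's sensor are all assumptions.

```python
# Illustrative sketch of the design above, not the shipped aerofoil_backfill.py:
# table columns, the 'queued' state filter, and the connection id are assumptions,
# and a plain task stands in for the plugin's sensor.
from datetime import datetime

from airflow.decorators import dag, task
from airflow.operators.bash import BashOperator
from airflow.providers.postgres.hooks.postgres import PostgresHook


@dag(schedule="* * * * *", start_date=datetime(2024, 1, 1), catchup=False)
def aerofoil_backfill_sketch():
    @task
    def fetch_pending_requests():
        # Pick up backfill requests queued by the Backfill UI.
        hook = PostgresHook(postgres_conn_id="airflow_db")  # assumed connection id
        rows = hook.get_records(
            "SELECT dag_id, start_date, end_date "
            "FROM aerofoil_backfill WHERE state = 'queued'"
        )
        return [
            f"airflow dags backfill {dag_id} -s {start} -e {end}"
            for dag_id, start, end in rows
        ]

    # One dynamically mapped task per pending request; the backfill runs as a
    # plain bash command, which keeps the design executor agnostic.
    BashOperator.partial(task_id="run_backfill").expand(
        bash_command=fetch_pending_requests()
    )


aerofoil_backfill_sketch()
```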

## Screenshots

##### Aerofoil Menu

![img](screenshots/menu.png)

##### Backfill Screen

![img](screenshots/backfill.png)

##### Reset DAG Menu

![img](screenshots/reset.png)

##### Fake Success Menu

![img](screenshots/fake_success.png)

## Contribute

If you find something missing and would like to contribute, feel free to raise a pull request. If you find a bug and would like me to fix it, please feel free to raise an issue.



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

Built Distribution

File details

Details for the file apache-airflow-providers-aerofoil2-1.0.0.tar.gz.

File metadata

File hashes

Hashes for apache-airflow-providers-aerofoil2-1.0.0.tar.gz
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 971c8c9197bcac20f6aa823cf4287bfe8fdb6b84ed8afcf252ead4a8e388dfc8 |
| MD5 | 90bddb5c490341038ad0d5ccbf3c5ad0 |
| BLAKE2b-256 | f5fb11576515f80986748c30031e9c2fc4eabd3730fe29a42b5a8607fc5d6ead |

See more details on using hashes here.

File details

Details for the file apache_airflow_providers_aerofoil2-1.0.0-py3-none-any.whl.

File metadata

File hashes

Hashes for apache_airflow_providers_aerofoil2-1.0.0-py3-none-any.whl
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 40f065ed9c84367275911f834c79130e39e6a82abdd4eb00a308da69df595237 |
| MD5 | d85a8d9ae4cfdfe4e82bb1cf2988ef61 |
| BLAKE2b-256 | 9febe20906f7d018a72ce0753bafef07c7c7571327c63e3c882886bbf9555743 |

See more details on using hashes here.
