
Machine Learning Orchestration

Project description

Dbnd-Airflow Syncing mechanism

dbnd-airflow-sync (an Airflow plugin)

dbnd-airflow-sync is an Airflow plugin that enables you to fetch data from the Airflow database and DAG folder. This Airflow-side module is one of the two components that allow you to sync your Airflow data into the Databand system.

What does it do?

The plugin exposes a REST API at GET /export_data, which expects a since parameter (UTC timestamp) and a period parameter (integer, in minutes). The API returns JSON with all the relevant information scraped from the Airflow system.
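For example, fetching a one-hour window with Python's requests library (a minimal sketch: the host, port, and ISO 8601 timestamp format are assumptions, and the exact response fields depend on your Airflow content):

import requests

# since: UTC timestamp marking the start of the window; period: window size in minutes
params = {"since": "2020-01-01T00:00:00+00:00", "period": 60}
resp = requests.get(
    "http://localhost:8080/admin/data_export_plugin/export_data",
    params=params,
)
resp.raise_for_status()
data = resp.json()  # JSON with the information scraped from the Airflow system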

Installation

dbnd-airflow-sync is installed using the Airflow plugin system.

Easy installation (recommended):

Copy the plugin file into the airflow plugins folder in your project (Airflow automatically looks for plugins in this folder at startup):

mkdir $AIRFLOW_HOME/plugins
cp dbnd-airflow-sync/src/dbnd_airflow_export/dbnd_airflow_export_plugin.py $AIRFLOW_HOME/plugins/

Setuptools:

You can also install dbnd-airflow-sync using setuptools.

cd dbnd-airflow-sync
pip install -e .

dbnd-airflow-sync (a Databand module)

dbnd-airflow-sync is a stand-alone Databand module that enables you to load data from an Airflow server and import it into the Databand system. This Databand-side module is one of the two components that allow you to sync your Airflow data into the Databand system.

Installation with setuptools

cd modules/dbnd-airflow-sync
pip install -e .

Usage

dbnd airflow-monitor

Configuration

You can configure the syncing variables inside airflow_sync.cfg:

[core]
interval = 10          ; Time in seconds to wait between fetching cycles
fetcher = web          ; Fetch method: directly from the db or through the REST API [web|db]
include_logs = True    ; Whether to include logs (might be heavy)
fetch_period = 60      ; Fetching window size in minutes (start: since, end: since + period)

[web]
url = http://localhost:8080/admin/data_export_plugin/export_data    ; When using fetcher=web, fetch from this url

[db]
sql_alchemy_conn = sqlite:////usr/local/airflow/airflow.db    ; When using fetcher=db, use this sql connection string
dag_folder = /usr/local/airflow/dags                          ; When using fetcher=db, this is the dag folder location
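As a rough illustration of how these values can be consumed, here is a minimal sketch that parses airflow_sync.cfg with Python's standard configparser (this is not Databand's actual loading code, and the variable names are assumptions):

import configparser

# inline_comment_prefixes is needed because the file uses ";" comments on value lines
config = configparser.ConfigParser(inline_comment_prefixes=(";", "#"))
config.read("airflow_sync.cfg")

interval = config.getint("core", "interval")              # seconds between fetching cycles
fetcher = config.get("core", "fetcher")                   # "web" or "db"
include_logs = config.getboolean("core", "include_logs")  # whether to include task logs
fetch_period = config.getint("core", "fetch_period")      # fetching window size, in minutes

if fetcher == "web":
    url = config.get("web", "url")
else:
    sql_alchemy_conn = config.get("db", "sql_alchemy_conn")
    dag_folder = config.get("db", "dag_folder")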

Download files


Files for dbnd-airflow-export, version 0.24.25:

dbnd_airflow_export-0.24.25-py2.py3-none-any.whl (11.4 kB), Wheel, py2.py3
dbnd-airflow-export-0.24.25.tar.gz (12.0 kB), Source
