
Apache Mesos Provider


Provider for Apache Airflow 2.x to schedule Airflow tasks on Apache Mesos


This provider for Apache Airflow contains the following features:

  • MesosExecutor - A scheduler to run Airflow DAGs on Mesos
  • MesosOperator - To execute Airflow tasks on Mesos. (TODO)

Issues

Please report issues at: https://github.com/m3scluster/airflow-provider-mesos/issues

Requirements

  • Airflow 2.x
  • Apache Mesos 1.6.x or newer

How to install and configure

On the Airflow server, install the Mesos provider:

pip install avmesos_airflow_provider
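
To confirm the package landed in the right environment (an optional sanity check, assuming pip uses the same Python interpreter that Airflow runs under):

pip show avmesos_airflow_provider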

Then we will configure Airflow.

vim airflow.cfg

[core]
executor = avmesos_airflow_provider.executors.mesos_executor.MesosExecutor

[mesos]
mesos_ssl = True
master = leader.mesos:5050
framework_name = Airflow
checkpoint = True
attributes = False
failover_timeout = 604800
command_shell = True
task_cpu = 1
task_memory = 20000
authenticate = True
default_principal = <MESOS USER>
default_secret = <MESOS PASSWORD>
docker_image_slave = <AIRFLOW DOCKER IMAGE>
docker_volume_driver = local
docker_volume_dag_name = airflowdags
docker_volume_dag_container_path = /home/airflow/airflow/dags/
docker_sock = /var/run/docker.sock
docker_volume_logs_name = airflowlogs
docker_volume_logs_container_path = /home/airflow/airflow/logs/
docker_environment = '[{ "name":"<KEY>", "value":"<VALUE>" }, { ... }]'
api_username = <USERNAME FOR THIS API>
api_password = <PASSWORD FOR THIS API>
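
After editing the config, Airflow's standard CLI can confirm that the executor setting was picked up; this should print the MesosExecutor class path configured above:

airflow config get-value core executor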

DAG example with the Mesos executor

from airflow import DAG
from datetime import datetime
from airflow.providers.docker.operators.docker import DockerOperator

default_args = {
    'owner': 'airflow',
    'description': 'Use of the DockerOperator',
    'depends_on_past': True,
}

with DAG(
    'docker_dag2',
    default_args=default_args,
    schedule_interval="*/10 * * * *",
    catchup=True,
    start_date=datetime.now(),
) as dag:
    t2 = DockerOperator(
        task_id='docker_command',
        image='centos:latest',
        api_version='auto',
        auto_remove=False,
        command="/bin/sleep 600",
        docker_url='unix:///var/run/docker.sock',
        # Per-task resources handed to the Mesos executor
        executor_config={
            "cpus": 2.0,
            "mem_limit": 2048,
        },
    )

    t2
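
To exercise the DAG once without waiting for the scheduler loop, Airflow's built-in dags test command can be used (the date below is only an arbitrary logical execution date, not something this provider requires):

airflow dags test docker_dag2 2024-01-01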

Development

For development and testing we provide a nix-shell file that installs Airflow, our Airflow provider, and PostgreSQL. To use it, follow these steps:

  1. Run mesos-mini:
docker run --rm --name mesos --privileged=true --shm-size=30gb -it --net host avhost/mesos-mini:1.11.0-0.2.0-1 /lib/systemd/systemd
  2. Enter the nix-shell and start the Airflow scheduler:
nix-shell

> airflow scheduler
  3. On the Mesos UI (http://localhost:5050) you will see Airflow registered as a framework.
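
As an alternative to the UI, framework registration can also be verified against the Mesos master's state endpoint (a standard Mesos HTTP API; the grep filter below is only an illustrative check):

curl -s http://localhost:5050/master/state | grep -o '"name":"Airflow"'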

