
Apache Mesos Provider

Project description

Provider for Apache Airflow 2.x to schedule Airflow DAGs on Apache Mesos


This provider for Apache Airflow contains the following features:

  • MesosExecutor - A scheduler to run Airflow DAGs on Mesos
  • MesosOperator - To execute Airflow tasks on Mesos. (TODO)

Issues

To report an issue, please use the issue tracker: https://github.com/m3scluster/airflow-provider-mesos/issues

Requirements

  • Airflow 2.x
  • Apache Mesos 1.6.x or later

How to install and configure

On the Airflow server, install the Mesos provider:

pip install avmesos_airflow_provider
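
A quick way to confirm the install succeeded is to import the executor class that airflow.cfg will reference below (a minimal sketch; the module path is the same one used in the executor setting):

# verify the provider is installed and the executor class resolves
from avmesos_airflow_provider.executors.mesos_executor import MesosExecutor

print(MesosExecutor)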

Then configure Airflow:

vim airflow.cfg

[core]
executor = avmesos_airflow_provider.executors.mesos_executor.MesosExecutor

[mesos]
mesos_ssl = True
master = leader.mesos:5050
framework_name = Airflow
checkpoint = True
attributes = False
failover_timeout = 604800
command_shell = True
task_cpu = 1
task_memory = 20000
authenticate = True
default_principal = <MESOS USER>
default_secret = <MESOS PASSWORD>
docker_image_slave = <AIRFLOW DOCKER IMAGE>
docker_volume_driver = local
docker_volume_dag_name = airflowdags
docker_volume_dag_container_path = /home/airflow/airflow/dags/
docker_sock = /var/run/docker.sock
docker_volume_logs_name = airflowlogs
docker_volume_logs_container_path = /home/airflow/airflow/logs/
docker_environment = '[{ "name":"<KEY>", "value":"<VALUE>" }, { ... }]'
api_username = <USERNAME FOR THIS API>
api_password = <PASSWORD FOR THIS API>
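
To sanity-check the configuration after editing, Airflow's own configuration API can read the values back (a minimal sketch using airflow.configuration.conf; the option names are the ones from the block above):

from airflow.configuration import conf

# read back a few options to confirm airflow.cfg was picked up
print(conf.get("core", "executor"))
print(conf.get("mesos", "master"))
print(conf.getboolean("mesos", "checkpoint"))
print(conf.getint("mesos", "task_cpu"))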

DAG example with the Mesos executor

from datetime import datetime

from airflow import DAG
from airflow.providers.docker.operators.docker import DockerOperator

default_args = {
    'owner': 'airflow',
    'description': 'Use of the DockerOperator',
    'depends_on_past': True,
}

with DAG(
    'docker_dag2',
    default_args=default_args,
    schedule_interval="*/10 * * * *",
    catchup=True,
    start_date=datetime.now(),
) as dag:
    t2 = DockerOperator(
        task_id='docker_command',
        image='centos:latest',
        api_version='auto',
        auto_remove=False,
        command="/bin/sleep 600",
        docker_url='unix:///var/run/docker.sock',
        # resources requested from Mesos for this task
        executor_config={
            "cpus": 2.0,
            "mem_limit": 2048,
        },
    )
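
The executor_config dictionary is how a task requests resources from the Mesos executor, and it is not specific to the DockerOperator. Here is a hedged sketch with a hypothetical PythonOperator task, assuming the same cpus and mem_limit keys are honored for any operator:

from airflow.operators.python import PythonOperator

def hello():
    print("hello from a Mesos-scheduled task")

# hypothetical task for illustration; place it inside the `with DAG(...)` block above
t3 = PythonOperator(
    task_id='python_on_mesos',
    python_callable=hello,
    executor_config={
        "cpus": 0.5,      # CPU share requested from Mesos (assumed)
        "mem_limit": 512,  # memory in MB, mirroring the example above (assumed)
    },
)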

Development

For development and testing we provide a nix-shell file that installs Airflow, this provider, and PostgreSQL. To use it, follow these steps:

  1. Run mesos-mini:

docker run --rm --name mesos --privileged=true --shm-size=30gb -it --net host avhost/mesos-mini:1.11.0-0.2.0-1 /lib/systemd/systemd

  2. Enter the nix-shell and start the Airflow scheduler:

nix-shell

> airflow scheduler

  3. On the Mesos UI (http://localhost:5050) you will see Airflow listed as a framework.
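
To confirm the registration from a script instead of the UI, you can query the Mesos master's state endpoint directly (a minimal sketch, assuming mesos-mini is listening on localhost:5050 as started above):

import requests

# "Airflow" should appear among the registered framework names
state = requests.get("http://localhost:5050/master/state").json()
print([framework["name"] for framework in state["frameworks"]])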

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

avmesos_airflow_provider-0.2.5.tar.gz (1.2 MB)

SHA256:      79b71625d454f951188d20409f5b17ecf4e7fb0b33d47cb5b2583bf5083993c8
MD5:         4594aecb9f1082e5a6787ae198f05247
BLAKE2b-256: 0cf08ff2f8bfe331cb78f32a6a5a3446a0765a994302457526012f23d9019969

Built Distribution

avmesos_airflow_provider-0.2.5-py3-none-any.whl (14.7 kB)

SHA256:      d3791cfc058605f9f2206107bf3a13c4866f998704d9d21cb0584a06ea2b0ec8
MD5:         99da84017489f56f699c93f1f02e168a
BLAKE2b-256: 7ccaec8354156455cec4300bb7690eb29499f9b82e4a6d1129ebe3776a2e54c8
