Apache Mesos Provider

Project description

A provider for Apache Airflow 2.x that schedules Airflow tasks on Apache Mesos


This provider for Apache Airflow contains the following features:

  • MesosExecutor - a scheduler to run Airflow DAGs on Mesos
  • MesosOperator - to execute Airflow tasks on Mesos. (TODO)

Issues

To open an issue, please use the issue tracker: https://github.com/m3scluster/airflow-provider-mesos/issues

Requirements

  • Airflow 2.x
  • Apache Mesos 1.6.x or newer

How to install and configure

On the Airflow server, install the Mesos provider:

pip install avmesos_airflow_provider
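
Afterwards you can check that Airflow picked up the provider package (the grep pattern assumes the package name appears in the provider listing):

airflow providers list | grep avmesos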

Then we will configure Airflow.

vim airflow.cfg

[core]
executor = avmesos_airflow_provider.executors.mesos_executor.MesosExecutor

[mesos]
mesos_ssl = True
master = leader.mesos:5050
framework_name = Airflow
checkpoint = True
attributes = False
failover_timeout = 604800
command_shell = True
task_cpu = 1
task_memory = 20000
authenticate = True
default_principal = <MESOS USER>
default_secret = <MESOS PASSWORD>
docker_image_slave = <AIRFLOW DOCKER IMAGE>
docker_volume_driver = local
docker_volume_dag_name = airflowdags
docker_volume_dag_container_path = /home/airflow/airflow/dags/
docker_sock = /var/run/docker.sock
docker_volume_logs_name = airflowlogs
docker_volume_logs_container_path = /home/airflow/airflow/logs/
docker_environment = '[{ "name":"<KEY>", "value":"<VALUE>" }, { ... }]'
api_username = <USERNAME FOR THIS API>
api_password = <PASSWORD FOR THIS API>
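
To double-check the effective executor after editing airflow.cfg, recent Airflow 2.x versions can print the resolved value:

airflow config get-value core executor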

DAG example with the Mesos executor

from datetime import datetime

from airflow import DAG
from airflow.providers.docker.operators.docker import DockerOperator

default_args = {
    'owner': 'airflow',
    'description': 'Use of the DockerOperator',
    'depends_on_past': True,
}

with DAG(
    'docker_dag2',
    default_args=default_args,
    schedule_interval='*/10 * * * *',
    start_date=datetime(2021, 1, 1),  # static start_date; datetime.now() would keep the scheduler from triggering runs
    catchup=False,  # avoid backfilling every 10-minute interval since the start_date
) as dag:
    t2 = DockerOperator(
        task_id='docker_command',
        image='centos:latest',
        api_version='auto',
        auto_remove=False,
        command='/bin/sleep 600',
        docker_url='unix:///var/run/docker.sock',
        executor_config={
            'cpus': 2.0,
            'mem_limit': 2048,
        },
    )

    t2
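
The executor_config keys above (cpus, mem_limit) presumably override the [mesos] task_cpu and task_memory defaults for a single task; this is an assumption based on the matching names, not documented behaviour. A self-contained sketch of the same per-task override on a PythonOperator (DAG name and task body are purely illustrative):

from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def hello():
    print('hello from a Mesos task')

with DAG('python_on_mesos_dag', start_date=datetime(2021, 1, 1), schedule_interval=None) as dag:
    t1 = PythonOperator(
        task_id='python_on_mesos',
        python_callable=hello,
        executor_config={
            'cpus': 0.5,       # assumed to override task_cpu for this task only
            'mem_limit': 512,  # assumed to override task_memory (MB) for this task only
        },
    )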

Development

For development and testing we provide a nix-shell file that installs Airflow, this provider, and PostgreSQL. To use it, follow these steps:

  1. Run mesos-mini:
docker run --rm --name mesos --privileged=true --shm-size=30gb -it --net host avhost/mesos-mini:1.11.0-0.2.0-1 /lib/systemd/systemd
  2. Enter the nix-shell and start the Airflow scheduler inside it:
nix-shell

> airflow scheduler
  3. On the Mesos UI (http://localhost:5050) you will see Airflow listed as a framework.
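
To confirm the registration from a shell instead of the UI, you can query the Mesos master's state endpoint (port 5050 as exposed by the mesos-mini container; the framework_name from airflow.cfg is what you grep for):

curl -s http://localhost:5050/master/state | grep -o '"name":"Airflow"'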
