Apache Mesos Provider
Provider for Apache Airflow 2.x to schedule tasks on Apache Mesos.
This provider for Apache Airflow contains the following features:
- MesosExecutor - An executor to run Airflow DAGs on Mesos
- MesosOperator - To execute Airflow tasks on Mesos. (TODO)
Issues
To open an issue, please use: https://github.com/m3scluster/airflow-provider-mesos/issues
Requirements
- Airflow 2.x
- Apache Mesos 1.6.x or newer
How to install and configure
On the Airflow server, install the Mesos provider:
pip install avmesos_airflow_provider
Then configure Airflow:
vim airflow.cfg
[core]
executor = avmesos_airflow_provider.executors.mesos_executor.MesosExecutor
[mesos]
mesos_ssl = True
master = leader.mesos:5050
framework_name = Airflow
checkpoint = True
attributes = False
failover_timeout = 604800
command_shell = True
task_cpu = 1
task_memory = 20000
authenticate = True
default_principal = <MESOS USER>
default_secret = <MESOS PASSWORD>
docker_image_slave = <AIRFLOW DOCKER IMAGE>
docker_volume_driver = local
docker_volume_dag_name = airflowdags
docker_volume_dag_container_path = /home/airflow/airflow/dags/
docker_sock = /var/run/docker.sock
docker_volume_logs_name = airflowlogs
docker_volume_logs_container_path = /home/airflow/airflow/logs/
docker_environment = '[{ "name":"<KEY>", "value":"<VALUE>" }, { ... }]'
api_username = <USERNAME FOR THIS API>
api_password = <PASSWORD FOR THIS API>
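After editing airflow.cfg, one way to confirm that the executor is picked up is to resolve it through Airflow's configuration and executor loader. This is only a sanity-check sketch and assumes a working Airflow 2.x installation whose AIRFLOW_HOME points at the edited airflow.cfg:

# Sanity check: print the configured executor path and the class Airflow resolves it to.
from airflow.configuration import conf
from airflow.executors.executor_loader import ExecutorLoader

print(conf.get("core", "executor"))
# -> avmesos_airflow_provider.executors.mesos_executor.MesosExecutor

executor = ExecutorLoader.get_default_executor()
print(type(executor).__name__)  # -> MesosExecutor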
DAG example with the Mesos executor
from datetime import datetime

from airflow import DAG
from airflow.providers.docker.operators.docker import DockerOperator

default_args = {
    'owner': 'airflow',
    'description': 'Use of the DockerOperator',
    'depends_on_past': True,
}

with DAG(
    'docker_dag2',
    default_args=default_args,
    schedule_interval="*/10 * * * *",
    catchup=True,
    start_date=datetime.now(),
) as dag:
    t2 = DockerOperator(
        task_id='docker_command',
        image='centos:latest',
        api_version='auto',
        auto_remove=False,
        command="/bin/sleep 600",
        docker_url='unix:///var/run/docker.sock',
        executor_config={
            "cpus": 2.0,
            "mem_limit": 2048,
        },
    )

    t2
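The executor_config keys shown above ("cpus", "mem_limit") can be set on any operator, not only the DockerOperator. A minimal sketch with a PythonOperator follows; the DAG id, task id, and resource values here are hypothetical placeholders:

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def hello():
    print("running on Mesos")

with DAG(
    'mesos_python_dag',
    schedule_interval=None,
    catchup=False,
    start_date=datetime(2023, 1, 1),
) as dag:
    # executor_config follows the same pattern as the DockerOperator example above;
    # the values are illustrative, not recommendations.
    t1 = PythonOperator(
        task_id='python_command',
        python_callable=hello,
        executor_config={
            "cpus": 0.5,
            "mem_limit": 512,
        },
    )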
Development
For development and testing we provide a nix-shell file that installs Airflow, this provider, and PostgreSQL. To use it, follow these steps:
- Run mesos-mini:
docker run --rm --name mesos --privileged=true --shm-size=30gb -it --net host avhost/mesos-mini:1.11.0-0.2.0-1 /lib/systemd/systemd
- Use nix-shell:
nix-shell
> airflow scheduler
- In the Mesos UI (http://localhost:5050) you will see Airflow registered as a framework.
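The registration can also be checked without the UI. A small sketch, assuming the mesos-mini master is reachable on localhost:5050:

import json
import urllib.request

# Query the Mesos master's state endpoint and list the registered frameworks.
# "Airflow" (the framework_name from airflow.cfg) should appear once the
# scheduler is running.
with urllib.request.urlopen("http://localhost:5050/master/state") as resp:
    state = json.load(resp)

print([framework["name"] for framework in state["frameworks"]])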