Apache Airflow provider for orchesjob – lightweight idempotent job runner

airflow-providers-orchesjob

⚠️ EXPERIMENTAL

This package is an experimental implementation. Do not use in production. APIs and behaviour may change without notice.

Apache Airflow provider for orchesjob. Starts and monitors orchesjob jobs on a remote host over SSH.


⚠️ Known Limitations

mode="reschedule" is required

You must use OrchesJobSensor with mode="reschedule". The default mode="poke" does not work.

# ❌ BROKEN: poke mode (default) — do not use
OrchesJobSensor(
    task_id="wait",
    job_id="...",
    ssh_conn_id="my_ssh",
    poke_interval=30.0,
)

# ✅ CORRECT: always specify mode="reschedule"
OrchesJobSensor(
    task_id="wait",
    job_id="...",
    ssh_conn_id="my_ssh",
    poke_interval=30.0,
    mode="reschedule",  # required
)

In poke mode the worker process stays alive for the entire duration of the sensor. In environments such as MWAA, the Airflow server responds with "Task Instance not found" for tasks that remain in the running state for a long time, causing the worker to terminate itself forcibly.

In reschedule mode the worker exits normally after each False return from poke(), and the scheduler re-queues the task after poke_interval seconds, avoiding this problem.


Requirements

  • Apache Airflow ≥ 2.6
  • apache-airflow-providers-ssh ≥ 3.0
  • orchesjob installed on the remote host

Installation

pip install airflow-providers-orchesjob

Setup

Register an SSH Connection in Airflow (Admin → Connections):

| Field     | Value                      |
| --------- | -------------------------- |
| Conn Id   | any name (e.g. `my_ssh`)   |
| Conn Type | SSH                        |
| Host      | remote host address        |
| Username  | SSH username               |
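Alternatively, the same connection can be registered through an environment variable using Airflow's standard `AIRFLOW_CONN_<CONN_ID>` URI mechanism (the user and host below are placeholders):

```shell
# Equivalent to the Admin → Connections entry above (values are placeholders)
export AIRFLOW_CONN_MY_SSH='ssh://my-user@remote-host.example.com'
```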

Usage

Use OrchesJobOperator to start a job and OrchesJobSensor to wait for completion.

from airflow.decorators import dag
from airflow_providers_orchesjob.operators.orchesjob import OrchesJobOperator
from airflow_providers_orchesjob.sensors.orchesjob import OrchesJobSensor

@dag(dag_id="my_dag", ...)
def my_dag():
    start = OrchesJobOperator(
        task_id="run_job",
        command=["/jobs/import.sh", "--date", "{{ ds }}"],
        ssh_conn_id="my_ssh",
    )

    wait = OrchesJobSensor(
        task_id="wait_job",
        job_id="{{ ti.xcom_pull(task_ids='run_job', key='job_id') }}",
        ssh_conn_id="my_ssh",
        poke_interval=30.0,
        timeout=3600.0,
        mode="reschedule",  # required
    )

    start >> wait

Idempotency

run_key defaults to {dag_id}__{task_id}__{run_id}. Re-triggering the same DAG run will not re-execute the job if it is still active.
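The documented default can be reproduced with a one-line format string (a sketch for illustration; the provider builds the real key from the task's runtime context):

```python
# Sketch of the documented default run_key format: {dag_id}__{task_id}__{run_id}
def default_run_key(dag_id: str, task_id: str, run_id: str) -> str:
    return f"{dag_id}__{task_id}__{run_id}"

# e.g. default_run_key("my_dag", "run_job", "manual__2024-01-01")
# → "my_dag__run_job__manual__2024-01-01"
```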

# Explicit run_key
OrchesJobOperator(
    task_id="import",
    command=["/jobs/import.sh"],
    ssh_conn_id="my_ssh",
    run_key="daily-import-{{ ds }}",
)

Set strict=True to prevent any re-execution with the same run_key, even after the previous job has finished.
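As a hedged sketch (not the provider's actual internals), the `strict` flag changes the idempotency decision roughly like this:

```python
# Hypothetical sketch of the strict idempotency rule; names are illustrative,
# not the provider's real code.
def should_start(run_key: str, active: set[str], finished: set[str],
                 strict: bool = False) -> bool:
    if run_key in active:
        return False   # a job with this key is still running: never restart
    if strict and run_key in finished:
        return False   # strict: refuse even after the previous job finished
    return True        # default: a finished key may be executed again
```

With `strict=True`, a re-trigger of `run_key="daily-import-2024-01-01"` is refused even though the earlier job already completed.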

Parameters

OrchesJobOperator

| Parameter      | Type         | Default  | Description                                  |
| -------------- | ------------ | -------- | -------------------------------------------- |
| command        | list[str]    | required | Command to run on the remote host            |
| ssh_conn_id    | str          | required | Airflow SSH Connection ID                    |
| run_key        | str \| None  | auto     | orchesjob idempotency key                    |
| strict         | bool         | False    | Prevent re-execution with the same run_key   |
| orchesjob_home | str \| None  | None     | Override ORCHESJOB_HOME on the remote host   |

OrchesJobSensor

| Parameter      | Type         | Default  | Description                                  |
| -------------- | ------------ | -------- | -------------------------------------------- |
| job_id         | str          | required | orchesjob job ID to monitor                  |
| ssh_conn_id    | str          | required | Airflow SSH Connection ID                    |
| orchesjob_home | str \| None  | None     | Override ORCHESJOB_HOME on the remote host   |
| poke_interval  | float        | 30.0     | Seconds between polls                        |
| timeout        | float        | 3600.0   | Sensor timeout in seconds                    |
| mode           | str          | "poke"   | Must be set to "reschedule"                  |

Error Handling

| Event                   | Airflow behaviour                        |
| ----------------------- | ---------------------------------------- |
| Job FAILED or LOST      | AirflowException → task retries apply    |
| Job CANCELLED           | AirflowException                         |
| SSH connection error    | AirflowException → task retries apply    |
| Sensor timeout exceeded | AirflowSensorTimeout → task retries apply |
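The table can be read as a poke-style state check. The sketch below is an assumption-laden illustration: it uses plain `RuntimeError` in place of `airflow.exceptions.AirflowException` to stay self-contained, and `"SUCCEEDED"`/`"RUNNING"` are assumed state names not taken from the table:

```python
# Hypothetical sketch of how terminal states map to sensor outcomes.
# FAILED / LOST / CANCELLED come from the table above; other names are assumed.
def poke_result(state: str) -> bool:
    if state == "SUCCEEDED":                       # assumed success state name
        return True                                # sensor completes
    if state in {"FAILED", "LOST", "CANCELLED"}:   # terminal failure states
        raise RuntimeError(f"orchesjob job ended in state {state}")
    return False                                   # still running: poll again later
```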

License

MIT
