
Apache Airflow provider for orchesjob – lightweight idempotent job runner


airflow-providers-orchesjob

⚠️ EXPERIMENTAL

This package is an experimental implementation. Do not use in production. APIs and behaviour may change without notice.

Apache Airflow provider for orchesjob. Starts and monitors orchesjob jobs on a remote host over SSH.


⚠️ Known Limitations

mode="reschedule" is required

You must use OrchesJobSensor with mode="reschedule". The default mode="poke" does not work.

# ❌ BROKEN: poke mode (default) — do not use
OrchesJobSensor(
    task_id="wait",
    job_id="...",
    ssh_conn_id="my_ssh",
    poke_interval=30.0,
)

# ✅ CORRECT: always specify mode="reschedule"
OrchesJobSensor(
    task_id="wait",
    job_id="...",
    ssh_conn_id="my_ssh",
    poke_interval=30.0,
    mode="reschedule",  # required
)

In poke mode, the worker process stays alive for the entire duration of the sensor. In environments such as MWAA, the Airflow server responds with "Task Instance not found" for tasks that remain in the running state for a long time, causing the worker to forcibly terminate itself.

In reschedule mode the worker exits normally after each False return from poke(), and the scheduler re-queues the task after poke_interval seconds, avoiding this problem.
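The polling behaviour can be sketched as a simple state check. This is an illustrative sketch only: the state names and the standalone `poke` function below are assumptions, not the provider's actual internals.

```python
# Illustrative sketch of a sensor poke check (state names are assumed,
# not taken from the provider's implementation).
ACTIVE_STATES = {"PENDING", "RUNNING"}

def poke(job_state: str) -> bool:
    # Returning False in reschedule mode lets the worker process exit;
    # the scheduler re-queues the task after poke_interval seconds.
    return job_state not in ACTIVE_STATES
```

In poke mode the same False return instead puts the worker to sleep for `poke_interval` seconds while it keeps occupying a worker slot, which is what triggers the MWAA problem described above.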


Requirements

  • Apache Airflow ≥ 2.6
  • apache-airflow-providers-ssh ≥ 3.0
  • orchesjob installed on the remote host

Installation

pip install airflow-providers-orchesjob

Setup

Register an SSH Connection in Airflow (Admin → Connections):

Field      Value
Conn Id    any name (e.g. my_ssh)
Conn Type  SSH
Host       remote host address
Username   SSH username
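Instead of the UI, the connection can also be supplied through an environment variable, which Airflow resolves by conn id. A minimal sketch, assuming the conn id `my_ssh`; the host and username are example values:

```shell
# Airflow maps AIRFLOW_CONN_MY_SSH to the connection id "my_ssh".
# "airflow_user" and "remote.example.com" are placeholder values.
export AIRFLOW_CONN_MY_SSH='ssh://airflow_user@remote.example.com'
echo "$AIRFLOW_CONN_MY_SSH"
```

This is convenient for containerized deployments where connections are injected as secrets rather than created by hand.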

Usage

Use OrchesJobOperator to start a job and OrchesJobSensor to wait for completion.

from airflow.decorators import dag
from airflow_providers_orchesjob.operators.orchesjob import OrchesJobOperator
from airflow_providers_orchesjob.sensors.orchesjob import OrchesJobSensor

@dag(dag_id="my_dag", ...)
def my_dag():
    start = OrchesJobOperator(
        task_id="run_job",
        command=["/jobs/import.sh", "--date", "{{ ds }}"],
        ssh_conn_id="my_ssh",
    )

    wait = OrchesJobSensor(
        task_id="wait_job",
        job_id="{{ ti.xcom_pull(task_ids='run_job', key='job_id') }}",
        ssh_conn_id="my_ssh",
        poke_interval=30.0,
        timeout=3600.0,
        mode="reschedule",  # required
    )

    start >> wait

# A @dag-decorated function must be called to register the DAG.
my_dag()

Idempotency

run_key defaults to {dag_id}__{task_id}__{run_id}. Re-triggering the same DAG run will not re-execute the job if it is still active.
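The default key is simply the three identifiers joined with double underscores. A sketch of how it is composed; the `run_id` value below is a typical scheduled-run id, shown for illustration:

```python
# Compose the default run_key as described above: {dag_id}__{task_id}__{run_id}.
dag_id, task_id = "my_dag", "run_job"
run_id = "scheduled__2024-01-01T00:00:00+00:00"  # example run_id
run_key = f"{dag_id}__{task_id}__{run_id}"
print(run_key)  # my_dag__run_job__scheduled__2024-01-01T00:00:00+00:00
```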

# Explicit run_key
OrchesJobOperator(
    task_id="import",
    command=["/jobs/import.sh"],
    ssh_conn_id="my_ssh",
    run_key="daily-import-{{ ds }}",
)

Set strict=True to prevent any re-execution with the same run_key, even after the previous job has finished.

Parameters

OrchesJobOperator

Parameter       Type        Default   Description
command         list[str]   required  Command to run on the remote host
ssh_conn_id     str         required  Airflow SSH Connection ID
run_key         str | None  auto      orchesjob idempotency key
strict          bool        False     Prevent re-execution with the same run_key
orchesjob_home  str | None  None      Override ORCHESJOB_HOME on the remote host

OrchesJobSensor

Parameter       Type        Default   Description
job_id          str         required  orchesjob job ID to monitor
ssh_conn_id     str         required  Airflow SSH Connection ID
orchesjob_home  str | None  None      Override ORCHESJOB_HOME on the remote host
poke_interval   float       30.0      Seconds between polls
timeout         float       3600.0    Sensor timeout in seconds
mode            str         "poke"    Must be set to "reschedule"

Error Handling

Event                    Airflow behaviour
Job FAILED or LOST       AirflowException → task retries apply
Job CANCELLED            AirflowException
SSH connection error     AirflowException → task retries apply
Sensor timeout exceeded  AirflowSensorTimeout → task retries apply
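The mapping above can be sketched as a simple state check. This is a hedged illustration, not the provider's API: the `SUCCEEDED` state name and the helper function are assumptions, and `RuntimeError` stands in for the `AirflowException` the provider actually raises.

```python
# Hedged sketch of the state handling described above. State names other
# than FAILED/LOST/CANCELLED are assumptions; the helper is illustrative.
FAILURE_STATES = {"FAILED", "LOST", "CANCELLED"}

def raise_for_state(state: str) -> bool:
    """Return True when the job finished successfully; raise on failure."""
    if state in FAILURE_STATES:
        # In the provider this surfaces as AirflowException, so Airflow's
        # normal task retry policy applies.
        raise RuntimeError(f"orchesjob job ended in state {state}")
    return state == "SUCCEEDED"
```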

License

MIT
