
A custom decorator to make any function SLURM-deployable; supports submitting multiple jobs and multiple job arrays.


Slurmomatic

A lightweight Python decorator to conditionally submit functions as SLURM jobs (or job arrays), falling back to local execution when SLURM is not available.

🚀 Key Features

  • 📦 Drop-in simple: Decorate any function with @slurmify(...).
  • 🔍 Auto-detects SLURM: Will submit jobs via SLURM if available, otherwise runs locally.
  • ⚙️ Unified interface: Same code works on your laptop or cluster — no changes needed.
  • 🧠 Smart job control: Supports both individual job submission and SLURM job arrays.

🔧 Requirements

  • Python 3
  • submitit (used under the hood for SLURM submission and local execution)
🧠 Usage

Step 1: Import

from slurmomatic import slurmify, batch

Step 2: Decorate your function

Each decorated function must accept a use_slurm: bool argument.
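To make the dispatch idea concrete, here is a minimal, hypothetical sketch of how a slurmify-style decorator can route on the use_slurm argument (this is an illustration, not the actual slurmomatic implementation; slurmify_sketch and its behavior are assumptions):

```python
import functools

def slurmify_sketch(**slurm_params):
    """Hypothetical sketch of a slurmify-style decorator (not the real API)."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, use_slurm=False, **kwargs):
            if use_slurm:
                # The real decorator would configure a submitit executor with
                # slurm_params here and submit the call as a SLURM job.
                raise NotImplementedError("SLURM submission elided in this sketch")
            return func(*args, **kwargs)  # local fallback: plain function call
        return wrapper
    return decorator

@slurmify_sketch(timeout_min=10)
def add(a: int, b: int, use_slurm: bool = False):
    return a + b

print(add(2, 3))  # runs locally, since use_slurm defaults to False
```

The key point is that the decorated function keeps one call signature everywhere; only the use_slurm flag decides whether the call is submitted or executed in-process.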


✅ Example 1: Submitting a SLURM Job Array

from slurmomatic import slurmify

@slurmify(slurm_array_parallelism=5, timeout_min=20)
def train(a: int, b: int, use_slurm: bool = False):
    print(f"Training with a={a}, b={b}")

# Runs a job array of 5 jobs, one per (a, b) pair
train([1, 2, 3, 4, 5], [10, 20, 30, 40, 50], use_slurm=True)

✅ Example 2: Submitting Multiple Individual Jobs

from slurmomatic import slurmify

@slurmify(timeout_min=10)
def run_experiment(seed: int, use_slurm: bool = False):
    print(f"Running experiment with seed={seed}")

for seed in range(5):
    run_experiment(seed, use_slurm=True)

Each call submits its own SLURM job (or runs locally).


✅ Example 3: Submitting Multiple Batches with Job Arrays

from slurmomatic import slurmify, batch

@slurmify(slurm_array_parallelism=10, timeout_min=30)
def evaluate(x: int, y: int, use_slurm: bool = False):
    print(f"Evaluating with x={x}, y={y}")

# Prepare large input lists
xs = list(range(1000))
ys = [1] * 1000

# Submit in batches of 200 using job arrays
for x_batch, y_batch in batch(200, xs, ys):
    evaluate(x_batch, y_batch, use_slurm=True)

This submits 5 SLURM job arrays of 200 jobs each; within each array, at most 10 jobs run concurrently (per slurm_array_parallelism=10).


📦 @slurmify(...) Parameters

You can pass any submitit SLURM parameters directly to the decorator:

@slurmify(timeout_min=30, cpus_per_task=4, gpus_per_node=1, partition="gpu")

Special key:

slurm_array_parallelism=N → Triggers job array mode; N caps how many array tasks run concurrently.


🧰 batch(batch_size: int, *args)

Utility to chunk long input lists into mini-batches.

from slurmomatic import batch

for a_batch, b_batch in batch(100, list_a, list_b):
    train(a_batch, b_batch, use_slurm=True)
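One plausible implementation of such a chunking helper (a sketch under the assumption that all input lists have equal length; the shipped batch may differ):

```python
def batch(batch_size, *args):
    """Yield aligned chunks of each input sequence, batch_size items at a time."""
    total = len(args[0])  # assumes all sequences share this length
    for start in range(0, total, batch_size):
        yield tuple(arg[start:start + batch_size] for arg in args)

list_a = list(range(10))
list_b = [x * 2 for x in list_a]
chunks = list(batch(4, list_a, list_b))
print([len(a) for a, _ in chunks])  # [4, 4, 2]
```

Note that the last chunk is simply shorter when the list length is not a multiple of batch_size, so every element is submitted exactly once.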

🛡️ Notes

✅ If SLURM is not available (sinfo not found or no job ID in environment), the jobs run locally using submitit.LocalExecutor.
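Interpreting that note literally, the availability check can be sketched as follows (an assumption about the detection logic, not the library's actual code):

```python
import os
import shutil

def slurm_available() -> bool:
    # Mirrors the note above: SLURM counts as unavailable when the `sinfo`
    # binary is missing from PATH or no SLURM job ID is set in the environment.
    return shutil.which("sinfo") is not None and "SLURM_JOB_ID" in os.environ

print(slurm_available())
```

On a laptop without SLURM this returns False, so decorated calls fall through to local execution.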

Todo:

  1. Add support for retrieving return values from submitted jobs
  2. Enable automatic requeueing

📜 License

MIT
