A custom decorator that makes any function SLURM-deployable. Supports submitting multiple jobs and multiple job arrays.
Slurmomatic
A lightweight Python decorator to conditionally submit functions as SLURM jobs (or job arrays), falling back to local execution when SLURM is not available.
🚀 Key Features
- 📦 Drop-in simple: Decorate any function with @slurmify(...).
- 🔍 Auto-detects SLURM: Submits jobs via SLURM if available, otherwise runs locally.
- ⚙️ Unified interface: Same code works on your laptop or cluster — no changes needed.
- 🧠 Smart job control: Supports both individual job submission and SLURM job arrays.
🔧 Requirements
- Python 3.10+
- submitit
🧠 Usage
Step 1: Import
```python
from slurmomatic import slurmify, batch
```
Step 2: Decorate your function
Each decorated function must accept a use_slurm: bool argument.
✅ Example 1: Submitting a SLURM Job Array
```python
from slurmomatic import slurmify

@slurmify(slurm_array_parallelism=5, timeout_min=20)
def train(a: int, b: int, use_slurm: bool = False):
    print(f"Training with a={a}, b={b}")

# Submit a job array of 5 parallel jobs
train([1, 2, 3, 4, 5], [10, 20, 30, 40, 50], use_slurm=True)
```
✅ Example 2: Submitting Multiple Individual Jobs
```python
from slurmomatic import slurmify

@slurmify(timeout_min=10)
def run_experiment(seed: int, use_slurm: bool = False):
    print(f"Running experiment with seed={seed}")

for seed in range(5):
    run_experiment(seed, use_slurm=True)
```
Each call submits its own SLURM job (or runs locally).
✅ Example 3: Submitting Multiple Batches with Job Arrays
```python
from slurmomatic import slurmify, batch

@slurmify(slurm_array_parallelism=10, timeout_min=30)
def evaluate(x: int, y: int, use_slurm: bool = False):
    print(f"Evaluating with x={x}, y={y}")

# Prepare large input lists
xs = list(range(1000))
ys = [1] * 1000

# Submit in batches of 200 using job arrays
for x_batch, y_batch in batch(200, xs, ys):
    evaluate(x_batch, y_batch, use_slurm=True)
```
This submits 5 SLURM job arrays, each with 200 jobs.
📦 @slurmify(...) Parameters
You can pass any SLURM submitit parameters directly to the decorator:
```python
@slurmify(timeout_min=30, cpus_per_task=4, gpus_per_node=1, partition="gpu")
```
Special key:
slurm_array_parallelism=10 → Triggers job array mode.
🧰 batch(batch_size: int, *args)
Utility to chunk long input lists into mini-batches.
```python
from slurmomatic import batch

for a_batch, b_batch in batch(100, list_a, list_b):
    train(a_batch, b_batch, use_slurm=True)
```
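The chunking behavior can be sketched as follows. Note that `batch_lists` is a hypothetical stand-in written for illustration, not the library's actual implementation: it yields aligned slices of each input list, with a shorter final chunk when the lengths don't divide evenly.

```python
from typing import Iterator

def batch_lists(batch_size: int, *lists: list) -> Iterator[tuple[list, ...]]:
    """Yield aligned chunks of at most batch_size items from each input list."""
    for start in range(0, len(lists[0]), batch_size):
        yield tuple(lst[start:start + batch_size] for lst in lists)

# Chunk two aligned 5-element lists into batches of 2
for a, b in batch_lists(2, [1, 2, 3, 4, 5], [10, 20, 30, 40, 50]):
    print(a, b)
```

Each yielded tuple holds one slice per input list, so the decorated function receives equally sized argument lists for each job array.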
🛡️ Notes
✅ If SLURM is not available (`sinfo` not found or no SLURM job ID in the environment), the jobs run locally using submitit.LocalExecutor.
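A detection check along the lines described above can be sketched like this (`slurm_available` is a hypothetical helper for illustration, not part of the public API):

```python
import os
import shutil

def slurm_available() -> bool:
    """Heuristic: treat SLURM as available if the `sinfo` binary is on
    PATH, or if this process already runs inside a SLURM job (job ID set
    in the environment)."""
    return shutil.which("sinfo") is not None or "SLURM_JOB_ID" in os.environ
```

This is the usual pattern for "cluster or laptop" fallbacks: probe for a scheduler binary first, then fall back to an environment variable set by the scheduler itself.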
📜 License
MIT