Dependency-free script to spool jobs into the SLURM scheduler without exceeding queue capacity limits.

Project description

Usage

You need to submit more slurm scripts than fit on the queue at once.

tree .
.
├── slurmscript0.slurm.sh
├── slurmscript1.slurm.sh
├── slurmscript2.slurm.sh
├── slurmscript3.slurm.sh
├── slurmscript4.slurm.sh
├── slurmscript5.slurm.sh
├── slurmscript6.slurm.sh
├── slurmscript7.slurm.sh
├── slurmscript8.slurm.sh
...

The qspool script will feed your job scripts onto the queue as space becomes available.

python3 -m qspool *.slurm.sh

You can also provide job script paths via stdin, which is useful for very large job batches.

find . -maxdepth 1 -name '*.slurm.sh' | python3 -m qspool

The qspool script creates a slurm job that submits your job scripts. When queue capacity fills, this qspool job will schedule a follow-up job to submit any remaining job scripts. This process continues until all job scripts have been submitted.
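In effect, the spooler runs a loop like the following. This is only a sketch of the mechanism for illustration, not qspool's actual code: the function names, the squeue/sbatch invocations, and the shape of the generated follow-up script are all assumptions.

import getpass
import pathlib
import subprocess
from typing import List

QUEUE_CAPACITY = 1000  # mirrors the --queue-capacity default


def num_jobs_queued() -> int:
    # count this user's pending + running jobs
    out = subprocess.run(
        ["squeue", "-h", "-u", getpass.getuser(), "-t", "pending,running"],
        capture_output=True,
        text=True,
        check=True,
    ).stdout
    return len(out.splitlines())


def submit_followup_spooler(remaining: List[str]) -> None:
    # hypothetical helper: hand leftover scripts to a fresh spooler job
    # (qspool proper embeds the scripts' contents; passing paths along
    # is a simplification for this sketch)
    followup = pathlib.Path("qspool_followup.slurm.sh")
    followup.write_text(
        "#!/bin/bash\npython3 -m qspool " + " ".join(remaining) + "\n"
    )
    subprocess.run(["sbatch", str(followup)], check=True)


def spool(job_script_paths: List[str]) -> None:
    # submit payloads until the queue is nearly full, then delegate the
    # remainder to a follow-up spooler job
    remaining = list(job_script_paths)
    while remaining and num_jobs_queued() < QUEUE_CAPACITY - 1:
        subprocess.run(["sbatch", remaining.pop(0)], check=True)
    if remaining:
        submit_followup_spooler(remaining)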

usage: qspool.py [-h] [--payload-job-script-paths-infile PAYLOAD_JOB_SCRIPT_PATHS_INFILE] [--job-log-path JOB_LOG_PATH] [--job-script-cc-path JOB_SCRIPT_CC_PATH]
                 [--queue-capacity QUEUE_CAPACITY] [--qspooler-job-title QSPOOLER_JOB_TITLE]
                 [payload_job_script_paths ...]

positional arguments:
  payload_job_script_paths
                        What scripts to spool onto slurm queue? (default: None)

options:
  -h, --help            show this help message and exit
  --payload-job-script-paths-infile PAYLOAD_JOB_SCRIPT_PATHS_INFILE
                        Where to read script paths to spool onto slurm queue? (default: <_io.TextIOWrapper name='<stdin>' mode='r' encoding='utf-8'>)
  --job-log-path JOB_LOG_PATH
                        Where should logs for qspool jobs be written? (default: ~/joblog/)
  --job-script-cc-path JOB_SCRIPT_CC_PATH
                        Where should copies of submitted job scripts be kept? (default: ~/jobscript/)
  --queue-capacity QUEUE_CAPACITY
                        How many jobs can be running or waiting at once? (default: 1000)
  --qspooler-job-title QSPOOLER_JOB_TITLE
                        What title should be included in qspooler job names? (default: none)

Installation

no installation:

python3 "$(tmpfile="$(mktemp)"; curl -s https://raw.githubusercontent.com/mmore500/qspool/v0.5.0/qspool.py > "${tmpfile}"; echo "${tmpfile}")" [ARGS]

pip installation:

python3 -m pip install qspool
python3 -m qspool [ARGS]

qspool has zero dependencies, so no setup or maintenance is required to use it. It is compatible all the way back to Python 3.6, so it will work on your cluster's ancient Python install.
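After a pip install, you can sanity-check the setup by printing the help text shown above:

python3 -m qspool --help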

How it Works

qspool
  * read contents of target slurm scripts
  * instantiate qspooler job script w/ target slurm scripts embedded
  * submit qspooler job script to slurm queue

⬇️ ⬇️ ⬇️

qspooler job 1
  * submit embedded target slurm scripts one by one until queue is almost full
  * instantiate qspooler job script w/ remaining target slurm scripts embedded
  * submit qspooler job script to slurm queue

⬇️ ⬇️ ⬇️

qspooler job 2
  * submit embedded target slurm scripts one by one until queue is almost full
  * instantiate qspooler job script w/ remaining target slurm scripts embedded
  * submit qspooler job script to slurm queue

...

qspooler job n
  * submit embedded target slurm scripts one by one
  * no embedded target slurm scripts remain
  * exit
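The key detail is that each spooler job carries the remaining payload scripts embedded in its own script, so no shared state file is needed between hops. Below is a hypothetical sketch of what that embedding step might look like; the heredoc layout and the function name are assumptions rather than qspool's actual template, and a real spooler would also repeat the capacity check and self-regeneration described above.

import pathlib
from typing import List


def make_followup_spooler(payload_texts: List[str], title: str = "qspool") -> pathlib.Path:
    # write an sbatch script with the remaining payload scripts embedded;
    # each payload rides along as a quoted heredoc that the follow-up job
    # re-materializes and re-submits when it eventually runs
    lines = ["#!/bin/bash", "#SBATCH --job-name=" + title]
    for i, text in enumerate(payload_texts):
        lines += [
            "cat > payload_{}.slurm.sh <<'EOF_{}'".format(i, i),
            text.rstrip("\n"),
            "EOF_{}".format(i),
            "sbatch payload_{}.slurm.sh".format(i),
        ]
    path = pathlib.Path("qspool_followup.slurm.sh")
    path.write_text("\n".join(lines) + "\n")
    return path  # submit with: sbatch qspool_followup.slurm.sh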

Related Software

roll_q uses a similar approach to solve this problem but differs in implementation strategy: roll_q tracks submission progress via an index variable in a file associated with a job batch, whereas qspool embeds the jobs in the submission worker script itself.

Download files

Download the file for your platform.

Source Distribution

qspool-0.5.0.tar.gz (6.9 kB)

Built Distribution

qspool-0.5.0-py3-none-any.whl (7.1 kB)

File details

Details for the file qspool-0.5.0.tar.gz.

File metadata

  • Download URL: qspool-0.5.0.tar.gz
  • Upload date:
  • Size: 6.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.12.2

File hashes

Hashes for qspool-0.5.0.tar.gz
  • SHA256: eb676941d0afd5d9e1a66076cdbef55e216c10613b8a1ba716535549bcbd16b9
  • MD5: 5dbe0fa425c0a69f1f893b68ead7990b
  • BLAKE2b-256: 4e0d7f7697019093cc68c78c114590b1b2e97a48351e108e78b5f4c96fd97287

File details

Details for the file qspool-0.5.0-py3-none-any.whl.

File metadata

  • Download URL: qspool-0.5.0-py3-none-any.whl
  • Upload date:
  • Size: 7.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.12.2

File hashes

Hashes for qspool-0.5.0-py3-none-any.whl
  • SHA256: 8a62fdb1da09ac9a3a018e54cd577d79226eeb82d8f778a980dc3ca91bb28f24
  • MD5: 6c3977a57526b7167eaca2397dcdbe9b
  • BLAKE2b-256: cca856b32656867718f46c2d573798c0ca1863f242327a17efc01443e2bcf641
