
oh-my-batch

A simple tool to manipulate batch tasks.

The goal of this project is to provide a handy command line tool for:

  • omb combo: generate folders/files from different combinations of parameters
  • omb batch: generate batch scripts from multiple working directories
  • omb job: track the state of jobs in a job scheduler

Install

pip install oh-my-batch
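
After installation, the omb command should be available on your PATH. A quick sanity check (assuming the same omb entry point used in the examples below):

omb --help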

Use cases

Generate files from different combinations of parameters

It's common in scientific computing to generate files with different combinations of parameters. For example, suppose you have 3 LAMMPS data files in the tmp directory: tmp/1.data, tmp/2.data, tmp/3.data, and you want to generate a series of input files with different temperatures (300 K, 400 K, 500 K) for each data file.

In this case, you can use the omb combo command to generate the series of input files for you.

#! /bin/bash
# prepare fake data files
mkdir -p tmp/
touch tmp/1.data tmp/2.data tmp/3.data

# prepare a lammps input file template
# NOTE: quote the heredoc delimiter so that $DATA_FILE, $TEMP and $RANDOM are
# written literally as placeholders instead of being expanded by the shell
cat > tmp/in.lmp.tmp <<'EOF'
read_data $DATA_FILE
velocity all create $TEMP $RANDOM
run 1000
EOF

# prepare a run script template
cat > tmp/run.sh.tmp <<EOF
cat in.lmp  # simulate running lammps
EOF

# generate input files
omb combo \
    add_files DATA_FILE tmp/*.data - \
    add_var TEMP 300 400 500 - \
    add_randint RANDOM -n 3 -a 1 -b 1000 --broadcast - \
    make_files tmp/in.lmp.tmp tmp/tasks/{i}-T-{TEMP}/in.lmp - \
    make_files tmp/run.sh.tmp tmp/tasks/{i}-T-{TEMP}/run.sh --mode 755 - \
    done

The above script will generate 9 folders in the tmp/tasks directory, named 0-T-300, 1-T-400, 2-T-500, 3-T-300, and so on up to 8-T-500. Each folder will contain an in.lmp file and a run.sh file.

The 9 folders are the combinations of 3 data files and 3 temperatures, and each input file will have an independent random number between 1 and 1000 as RANDOM.
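
Following the naming pattern above, the generated layout is expected to look roughly like this:

ls tmp/tasks
# 0-T-300  1-T-400  2-T-500  3-T-300  4-T-400  5-T-500  6-T-300  7-T-400  8-T-500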

You can run the above script via ./examples/omb-combo.sh, and you can run omb combo --help to see the detailed usage of the combo command.

Generate batch scripts from multiple working directories

It's common to submit a lot of jobs to a job scheduler. omb batch is designed to help you take multiple working directories and package them into a few batch scripts.

Let's continue the above example: you now have 9 folders in the tmp/tasks directory and want to package them into 2 batch scripts to submit to a job scheduler.

You can use omb batch to generate batch scripts for you like this:

#! /bin/bash
cat > tmp/lammps_header.sh <<EOF
#!/bin/bash
#SBATCH -J lmp
#SBATCH -n 1
#SBATCH -t 1:00:00
EOF

omb batch \
    add_work_dir tmp/tasks/* - \
    add_header_file tmp/lammps_header.sh - \
    add_command "checkpoint lmp.done ./run.sh" - \
    make tmp/lmp-{i}.slurm --concurrency 2

You will find the batch scripts tmp/lmp-0.slurm and tmp/lmp-1.slurm in the tmp directory.
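
A quick check of the output:

ls tmp/lmp-*.slurm
# tmp/lmp-0.slurm  tmp/lmp-1.slurm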

omb batch also provides some useful shell functions inside the generated batch scripts. For example, checkpoint checks whether a job is already done and skips it if so.
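
The idea behind checkpoint can be sketched as a small shell function (an illustration of the concept only, not omb's actual implementation):

# skip the command if the flag file already exists, and create the flag file
# only when the command succeeds
checkpoint() {
    local flag=$1; shift
    if [ -f "$flag" ]; then
        echo "skip: $flag already exists"
        return 0
    fi
    "$@" && touch "$flag"
}

# usage inside a task directory: run ./run.sh once, skip it on re-runs
checkpoint lmp.done ./run.sh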

You can run the above script via ./examples/omb-batch.sh.

Track the state of jobs in a job scheduler

Let's continue the above example: now you want to submit the batch scripts to the job scheduler.

You can use omb job to submit them and track the state of the jobs.

omb job slurm \
    submit tmp/*.slurm --max_tries 3 --wait --recovery lammps-jobs.json 

The above command will submit the batch scripts to the job scheduler and wait for the jobs to finish. If a job fails, it will be retried at most 3 times.

The --recovery option saves the job information to the lammps-jobs.json file. If omb job is interrupted, you can run the exact same command to recover the job status, so you don't need to resubmit jobs that were already submitted.
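
For example, if the command above is interrupted, simply running it again resumes from the recovery file:

# re-run the exact same command after an interruption; jobs recorded in
# lammps-jobs.json are recovered instead of being submitted again
omb job slurm \
    submit tmp/*.slurm --max_tries 3 --wait --recovery lammps-jobs.json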

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

oh_my_batch-0.1.0.dev2.tar.gz (21.5 kB)


Built Distribution

oh_my_batch-0.1.0.dev2-py3-none-any.whl (22.7 kB)


File details

Details for the file oh_my_batch-0.1.0.dev2.tar.gz.

File metadata

  • Download URL: oh_my_batch-0.1.0.dev2.tar.gz
  • Upload date:
  • Size: 21.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.9.12 Linux/5.15.167.4-microsoft-standard-WSL2

File hashes

Hashes for oh_my_batch-0.1.0.dev2.tar.gz:

  • SHA256: 443fdd1bd81bd89ed37d5d2236d789bfd2a41072438040005805202c199d5c7a
  • MD5: 5b7f2e023fdbd58f755c8c922410d13e
  • BLAKE2b-256: b8019d7d3901b7505281a4995da6c9303bc990c23d5793aca8abbbd6ebd0149d


File details

Details for the file oh_my_batch-0.1.0.dev2-py3-none-any.whl.

File metadata

  • Download URL: oh_my_batch-0.1.0.dev2-py3-none-any.whl
  • Upload date:
  • Size: 22.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.9.12 Linux/5.15.167.4-microsoft-standard-WSL2

File hashes

Hashes for oh_my_batch-0.1.0.dev2-py3-none-any.whl:

  • SHA256: e545234fe0c2e373369a882694d6b0638f26333fb7b9ab3fb17708773ad4512e
  • MD5: 37402c95f030dc849ab288df206e90db
  • BLAKE2b-256: 49e096b565070aba637d4ad504c9947bfb368f98a4bbfd548e7e8e1c7a32a88e

