
Project description

oh-my-batch

A simple tool to manipulate batch tasks.

The goal of this project is to provide a handy command-line tool for:

  • omb combo: generate folders/files from different combinations of parameters
  • omb batch: generate batch scripts from multiple working directories
  • omb job: track the state of jobs in a job scheduler

Install

pip install oh-my-batch
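
To check that the command-line entry point is available after installation, you can print the top-level help (this assumes the standard --help flag, which the subcommands shown below also support):

omb --help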

Use cases

Generate files from different combinations of parameters

In scientific computing it's common to generate files from different combinations of parameters. For example, suppose you have 3 LAMMPS data files in the tmp directory: tmp/1.data, tmp/2.data, tmp/3.data, and you want to generate a series of input files that pair each data file with different temperatures: 300K, 400K and 500K.

In this case, you can use the omb combo command to generate the input files for you.

#! /bin/bash
# prepare fake data files
mkdir -p tmp/
touch tmp/1.data tmp/2.data tmp/3.data

# prepare a lammps input file template
cat > tmp/in.lmp.tmp <<'EOF'  # quote the delimiter so $DATA_FILE, $TEMP and $RANDOM stay literal placeholders
read_data $DATA_FILE
velocity all create $TEMP $RANDOM
run 1000
EOF

# prepare a run script template
cat > tmp/run.sh.tmp <<EOF
cat in.lmp  # simulate running lammps
EOF

# generate input files
omb combo \
    add_files DATA_FILE tmp/*.data - \
    add_var TEMP 300 400 500 - \
    add_randint RANDOM -n 3 -a 1 -b 1000 --broadcast - \
    make_files tmp/in.lmp.tmp tmp/tasks/{i}-T-{TEMP}/in.lmp - \
    make_files tmp/run.sh.tmp tmp/tasks/{i}-T-{TEMP}/run.sh --mode 755 - \
    done

The above script will generate 9 folders in the tmp/tasks directory, named 0-T-300, 1-T-400, 2-T-500, 3-T-300, and so on up to 8-T-500. Each folder will contain an in.lmp file and a run.sh file.

The 9 folders are the combinations of the 3 data files and 3 temperatures, and each input file gets an independent random number between 1 and 1000 as RANDOM.
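
For reference, the generated layout should look roughly like this (the index-to-temperature pairing follows the naming described above):

ls tmp/tasks
# 0-T-300  1-T-400  2-T-500  3-T-300  4-T-400  5-T-500  6-T-300  7-T-400  8-T-500

ls tmp/tasks/0-T-300
# in.lmp  run.sh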

You can run the above script via ./examples/omb-combo.sh, and you can also run omb combo --help to see the detailed usage of the combo command.

Generate batch scripts from multiple working directories

It's common to submit a lot of jobs to a job scheduler. omb batch is designed to take multiple working directories and package them into a few batch scripts for submission.

Let's continue the above example: you now have 9 folders in the tmp/tasks directory and want to package them into 2 batch scripts to submit to a job scheduler.

You can use omb batch to generate batch scripts for you like this:

#! /bin/bash
cat > tmp/lammps_header.sh <<EOF
#!/bin/bash
#SBATCH -J lmp
#SBATCH -n 1
#SBATCH -t 1:00:00
EOF

omb batch \
    add_work_dir tmp/tasks/* - \
    add_header_file tmp/lammps_header.sh - \
    add_command "checkpoint lmp.done ./run.sh" - \
    make tmp/lmp-{i}.slurm --concurrency 2

You will then find two batch scripts, tmp/lmp-0.slurm and tmp/lmp-1.slurm, in the tmp directory.

omb batch also provides some useful functions in the generated batch scripts. For example, checkpoint checks whether a task is already done (via the lmp.done marker above) and skips it if so.
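
The actual helper is generated by omb batch itself; the following is only an illustrative sketch of the idea (the function body is an assumption, not the generated code): the wrapped command runs only when the marker file is missing, and the marker is created on success.

# illustrative sketch of the checkpoint idea, not the code omb batch generates
checkpoint() {
    local marker=$1; shift
    if [ -f "$marker" ]; then
        echo "skip: $marker already exists"
        return 0
    fi
    # run the wrapped command; create the marker only if it succeeds
    "$@" && touch "$marker"
}

# each task in the generated script is wrapped roughly like:
# cd tmp/tasks/0-T-300 && checkpoint lmp.done ./run.sh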

You can run the above script via ./examples/omb-batch.sh.

Track the state of jobs in a job scheduler

Let's continue the above example: now you want to submit the batch scripts to the job scheduler.

You can use omb job to submit them and track their state until they finish.

omb job slurm \
    submit tmp/*.slurm --max_tries 3 --wait --recovery lammps-jobs.json 

The above command will submit the batch scripts to the job scheduler and wait for the jobs to finish. If a job fails, it will be retried at most 3 times.

The --recovery option saves the job information to the lammps-jobs.json file. If omb job is interrupted, you can run the exact same command to recover the job status, so jobs that were already submitted are not submitted again.
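
For example, if the waiting loop is interrupted, rerunning the exact same command picks up the state recorded in lammps-jobs.json instead of submitting everything again:

# first run: submits the scripts, records them in lammps-jobs.json, then waits
omb job slurm submit tmp/*.slurm --max_tries 3 --wait --recovery lammps-jobs.json

# after an interruption: the same command reuses lammps-jobs.json,
# skips the jobs that were already submitted, and continues waiting for them
omb job slurm submit tmp/*.slurm --max_tries 3 --wait --recovery lammps-jobs.json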

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

oh_my_batch-0.1.0.dev1.tar.gz (21.6 kB)

Uploaded Source

Built Distribution

oh_my_batch-0.1.0.dev1-py3-none-any.whl (22.8 kB)

Uploaded Python 3

File details

Details for the file oh_my_batch-0.1.0.dev1.tar.gz.

File metadata

  • Download URL: oh_my_batch-0.1.0.dev1.tar.gz
  • Upload date:
  • Size: 21.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.9.12 Linux/5.15.167.4-microsoft-standard-WSL2

File hashes

Hashes for oh_my_batch-0.1.0.dev1.tar.gz
  • SHA256: f249426b39b534edda8421df5a3472746bf2da4d9f42a315ba6413be6b8dda21
  • MD5: ce95f95dedb3e9dd1d6769f87ceedd5e
  • BLAKE2b-256: 358ec9275b550cc91c6ebd14a6f304119a9db08527ae36579486873b5ba8d41f


File details

Details for the file oh_my_batch-0.1.0.dev1-py3-none-any.whl.

File metadata

  • Download URL: oh_my_batch-0.1.0.dev1-py3-none-any.whl
  • Upload date:
  • Size: 22.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.9.12 Linux/5.15.167.4-microsoft-standard-WSL2

File hashes

Hashes for oh_my_batch-0.1.0.dev1-py3-none-any.whl
  • SHA256: c2a01c367d2c6f5e4501e4ba96ce2b1e596432fba71ed94c18209f027470daf2
  • MD5: 198ddf003c513510119e43be33187500
  • BLAKE2b-256: dc47a31722180f3739514e079458d3734b9ae989eb3689f399ea2ae616133be5

