
EasyRunner



EasyRunner is a lightweight tool for managing and executing multiple parallel experiments. It simplifies the process of running multiple experiments with different configurations or hyperparameters, while monitoring system resources.

Features

  • Run multiple experiments in parallel
  • Monitor system resources (CPU and memory usage) during experiments
  • Early termination of experiments by inputting the experiment number
  • Colorized display of experiment status and resource usage
  • Generate a list of instructions from a template and arguments

Installation

Install EasyRunner from PyPI:

pip install easy_runner

Or download or clone this repository and run:

pip install -e .

Usage

  1. Initialize an EasyRunner object with the required parameters.
  2. Specify a list of command-line instructions to run.
  3. Use the start method to run the experiments. You can pass a list of GPU IDs to run on (None by default).
  4. Optionally, use the compose method to generate a list of instructions from a template and argument lists.

A simple example that runs a list of instructions, two at a time, on CUDA devices 0 and 1:

from easy_runner import EasyRunner

# Initialize the EasyRunner
runner = EasyRunner(log_name="experiment_logs")

# Create a list of instructions
instructions = [
    "python script1.py --param1 0.1 --param2 100",
    "python script1.py --param1 0.2 --param2 200",
    "python script2.py --param1 0.3 --param2 300",
    "python script2.py --param1 0.4 --param2 400",
    "python script3.py --param1 0.5 --param2 500"
]

# Run experiments
runner.start(instructions, gpus=[0, 1], max_parallel=2)
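EasyRunner's scheduling internals aren't shown on this page, but for intuition, one common way a runner can pin each command to a specific device is by prefixing it with CUDA_VISIBLE_DEVICES. The sketch below is purely illustrative (assign_gpus is a hypothetical helper, not part of EasyRunner's API):

```python
# Hypothetical sketch of round-robin GPU assignment, NOT EasyRunner's actual code.
def assign_gpus(instructions, gpus):
    """Prefix each command with CUDA_VISIBLE_DEVICES, cycling through gpus."""
    if not gpus:
        return list(instructions)
    return [
        f"CUDA_VISIBLE_DEVICES={gpus[i % len(gpus)]} {cmd}"
        for i, cmd in enumerate(instructions)
    ]

cmds = assign_gpus(["python a.py", "python b.py", "python c.py"], [0, 1])
# cmds[0] and cmds[2] are pinned to GPU 0, cmds[1] to GPU 1
```

With gpus=[0, 1] and max_parallel=2 as above, each running process then sees only its assigned device.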

Another example, using the compose method to perform a grid search:

from easy_runner import EasyRunner

# Initialize the EasyRunner
runner = EasyRunner(log_name="experiment_logs")

# Lists of seeds and tasks
seeds = [0, 10, 20]
tasks = ["TaskA-v0 --epoch 30", "TaskB-v0 --epoch 150", "TaskC-v0 --epoch 80"]

# Define the command template
template = "nohup python train_script.py --project my_project --seed {} --task {} "

# Generate a list of instructions using the compose method
instructions = runner.compose(template, [seeds, tasks])

# Run the experiments
runner.start(instructions, max_parallel=4)
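For intuition, compose fills the template with combinations of the argument lists. The snippet below reproduces that expansion with itertools.product; it is a sketch of the behavior implied by the example above, not EasyRunner's internal code:

```python
from itertools import product

seeds = [0, 10, 20]
tasks = ["TaskA-v0 --epoch 30", "TaskB-v0 --epoch 150", "TaskC-v0 --epoch 80"]
template = "nohup python train_script.py --project my_project --seed {} --task {} "

# Cartesian product of the argument lists: one command per (seed, task) pair.
instructions = [template.format(*combo) for combo in product(seeds, tasks)]
print(len(instructions))  # 3 seeds x 3 tasks = 9 commands
```

With max_parallel=4, EasyRunner would then work through these 9 commands four at a time.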

You can try the example scripts in the examples folder:

cd examples
python test_easy_runner.py

License

This project is licensed under the MIT License. See the LICENSE file for details.

Contributing

We welcome contributions to EasyRunner. Please open an issue or submit a pull request on the GitHub repository.


