
nshrunner

nshrunner is a Python library that provides a unified way to run functions in various environments, such as local dev machines, cloud VMs, SLURM clusters, and LSF clusters. It was created to simplify the process of running ML training jobs across multiple machines and environments.

Motivation

Running ML training jobs across different machines and environments means managing the quirks of each one. nshrunner addresses this by exposing a single interface that runs jobs on any supported environment without environment-specific boilerplate.

Features

  • Supports running functions locally, on SLURM clusters, and on LSF clusters
  • Provides a unified interface for running functions across different environments
  • Allows for easy configuration of job options, such as resource requirements and environment variables
  • Supports snapshotting the environment to ensure reproducibility, using the nshsnap library
  • Provides utilities for logging, seeding, and signal handling

Installation

nshrunner can be installed using pip:

pip install nshrunner

Usage

Here's a simple example of how to use nshrunner to run a function locally:

import nshrunner as R

def run_fn(x: int):
    return x + 5

# Each tuple holds the positional arguments for one invocation of run_fn.
runs = [(1,)]

runner = R.Runner(run_fn)
list(runner.local(runs))
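The `runs` argument is a sequence of positional-argument tuples, one per invocation. Conceptually, local execution behaves like mapping the function over those tuples. A plain-Python sketch of that contract (`run_local` here is illustrative, not nshrunner's actual implementation):

```python
from typing import Any, Callable, Iterable, Iterator, Sequence

def run_local(fn: Callable[..., Any], runs: Iterable[Sequence[Any]]) -> Iterator[Any]:
    # Each element of `runs` is the tuple of positional arguments for one call.
    for args in runs:
        yield fn(*args)

def run_fn(x: int) -> int:
    return x + 5

results = list(run_local(run_fn, [(1,), (2,), (3,)]))  # [6, 7, 8]
```

Because results are yielded lazily, wrapping the call in `list(...)` (as in the example above) is what actually drives the runs to completion.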

To run the same function on a SLURM cluster:

runner.submit_slurm(
    runs,
    {
        "partition": "learnaccel",
        "nodes": 4,
        "ntasks_per_node": 8,  # Change this to limit # of GPUs
        "gpus_per_task": 1,
        "cpus_per_task": 1,
    },
    snapshot=True,
)
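The keys in the options dict correspond to standard SLURM resource settings. To see what a given configuration requests, it can help to think of each key as an `sbatch` directive. A hypothetical sketch of that mapping (`sbatch_directives` is illustrative only, not part of nshrunner's API):

```python
def sbatch_directives(options: dict) -> list[str]:
    # Render each option key as an #SBATCH flag, converting
    # underscores to the hyphenated form sbatch expects.
    return [f"#SBATCH --{key.replace('_', '-')}={value}" for key, value in options.items()]

directives = sbatch_directives(
    {
        "partition": "learnaccel",
        "nodes": 4,
        "ntasks_per_node": 8,
        "gpus_per_task": 1,
        "cpus_per_task": 1,
    }
)
# e.g. "#SBATCH --partition=learnaccel", "#SBATCH --ntasks-per-node=8", ...
```

With `gpus_per_task=1`, the number of tasks per node effectively caps the GPUs used per node, which is why the example comments suggest changing `ntasks_per_node` to limit GPU usage.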

And on an LSF cluster:

runner.submit_lsf(
    runs,
    {
        "summit": True,
        "queue": "learnaccel",
        "nodes": 4,
        "rs_per_node": 8,  # Change this to limit # of GPUs
    },
    snapshot=True,
)

For more detailed usage examples, please refer to the documentation.

Acknowledgements

nshrunner is heavily inspired by submitit. It builds on submitit's design and adds support for LSF clusters, snapshotting, and other features.

Contributing

Contributions are welcome! For feature requests, bug reports, or questions, please open an issue on GitHub. If you'd like to contribute code, please submit a pull request with your changes.

License

nshrunner is released under the MIT License. See LICENSE for more information.
