
Launch and monitor hyperparameter optimization job arrays on SLURM

Project description

HyperHerd

Hyperparameter sweeps on SLURM, run by an autonomous agent. Declare your search in YAML, hand over a one-line launcher script, and walk away — herd monitor submits trials in stages, diagnoses failures, retries the ones SLURM can fix, and pings you on Discord only when it can't.

📖 Full documentation: allenwlynch.github.io/hyperherd

What you get

  • One-command sweeps. Write a YAML, run herd run, and that's it — no sbatch boilerplate, no manual resubmits.
  • An agent that operates the sweep for you. herd monitor ramps trials in stages, diagnoses failures, bumps memory or wall-time when that's the right fix, and only interrupts you when it isn't.
  • Two-way Discord control. A dedicated channel per sweep with deterministic slash commands (/status, /run, /cancel, /tail, …) and free-form mentions for the agent.
  • Resume from anywhere. Pull the plug, edit the sweep, re-run — completed trials stick, failed ones go back to the queue.
  • Edit mid-run. Bump a learning-rate range or add a value; the next herd run appends the new trials without disturbing the ones already running.
  • Configs you don't have to memorize. A bundled Claude Code skill writes hyperherd.yaml for you from a one-paragraph description.

Hydra is the recommended trainer harness (its CLI consumes the override format natively), but the launcher is free-form bash — parse the arguments however you want.
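Since the launcher is free-form bash, it can be as small as a forwarding script. A minimal sketch, assuming a Hydra-style trainer entry point named `train.py` — the trainer name and the function wrapper here are illustrative, not part of HyperHerd:

```shell
#!/usr/bin/env bash
# launch.sh (sketch): HyperHerd hands each trial its hyperparameters as
# CLI arguments; a Hydra CLI accepts key=value overrides verbatim, so the
# launcher only needs to forward them.
launch() {
  echo "launching trial with: $*"   # stand-in for: python train.py "$@"
}

launch model.lr=0.001 data.batch_size=64
```

For a non-Hydra trainer, the same script is the place to translate `key=value` overrides into whatever flags your CLI expects before invoking it.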

Quick start

# Install (Python 3.8+ for the base CLI)
pip install hyperherd

# Install the Claude Code skill for authoring sweep configs
herd install-skill

# Scaffold a workspace
herd init my_experiment

# Edit my_experiment/hyperherd.yaml and my_experiment/launch.sh, then:
herd run my_experiment --dry-run    # preview
herd run my_experiment              # submit
herd status my_experiment           # one-shot status

To run the autonomous monitor (Python ≥ 3.10 + a Discord bot — see Discord setup):

pip install 'hyperherd[monitor]'
herd monitor my_experiment

Have Claude Code set you up

Open Claude Code in your project directory and paste this — it'll walk you through install, config authoring, validation, and (if you want it) the autonomous monitor end-to-end:

Help me set up HyperHerd. Read the setup guide at
https://raw.githubusercontent.com/AllenWLynch/hyperherd/main/docs/setup-help.md
and follow it — start with the Phase 0 interview, then drive the rest.

The full guide is also browsable at allenwlynch.github.io/hyperherd/setup-help.

Documentation

Requirements

  • Python ≥ 3.8 for the base herd CLI
  • Python ≥ 3.10 for the [monitor] extras (the autonomous monitor — Discord, Claude Agent SDK)
  • A SLURM cluster with sbatch, sacct, squeue, scancel on the submission host
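The SLURM requirement can be checked up front with a quick preflight on the submission host — a sketch; the tool list simply mirrors the bullet above, and the function name is illustrative:

```shell
# Preflight (sketch): report which of the SLURM client tools HyperHerd
# shells out to are available on this host's PATH.
slurm_preflight() {
  for tool in sbatch sacct squeue scancel; do
    if command -v "$tool" >/dev/null 2>&1; then
      echo "ok: $tool"
    else
      echo "missing: $tool"
    fi
  done
}

slurm_preflight
```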

Download files

Download the file for your platform.

Source Distribution

hyperherd-0.1.0.tar.gz (208.3 kB)


Built Distribution


hyperherd-0.1.0-py3-none-any.whl (123.0 kB)


File details

Details for the file hyperherd-0.1.0.tar.gz.

File metadata

  • Download URL: hyperherd-0.1.0.tar.gz
  • Upload date:
  • Size: 208.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for hyperherd-0.1.0.tar.gz

  • SHA256: bdcaab0932f52836a6c90069f8da3e85c416a44818d481b91b0c25d87d6cab6d
  • MD5: 7641618fbb88e6c7b8233019da07edff
  • BLAKE2b-256: 57e67fc4ed97eebfc1b94c4e034d8114a2972bf5aae0b93bf8904e51994b6a70

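To check a downloaded sdist against the SHA256 digest published above, feed `sha256sum -c` a `<digest>  <filename>` line. A sketch — the helper name `verify_sdist` and the path argument are illustrative:

```shell
# Sketch: verify hyperherd-0.1.0.tar.gz against its published SHA256.
expected_sha256=bdcaab0932f52836a6c90069f8da3e85c416a44818d481b91b0c25d87d6cab6d

verify_sdist() {
  # $1: path to the downloaded hyperherd-0.1.0.tar.gz
  echo "$expected_sha256  $1" | sha256sum -c -
}
```

`sha256sum -c` exits non-zero on a mismatch, so the check composes cleanly with `set -e` in install scripts.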

Provenance

The following attestation bundles were made for hyperherd-0.1.0.tar.gz:

Publisher: publish.yml on AllenWLynch/hyperherd


File details

Details for the file hyperherd-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: hyperherd-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 123.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for hyperherd-0.1.0-py3-none-any.whl

  • SHA256: ce66eac5f984ada438448dbe2e91c23fcdf0d08e054cf660613a02eb19527be6
  • MD5: 1c88a13bddb9786888d393bdda0c61d0
  • BLAKE2b-256: 514239053d55be4c825d787b711bc216f7ade64701b9fac1a5c612b651b7a0fb


Provenance

The following attestation bundles were made for hyperherd-0.1.0-py3-none-any.whl:

Publisher: publish.yml on AllenWLynch/hyperherd

