Launch and monitor hyperparameter optimization job arrays on SLURM

Project description

HyperHerd

⚠️ Pre-release / actively developed. HyperHerd is in soft launch — the YAML schema, CLI flags, and Python API may change without notice between versions. Pin to an exact version (hyperherd==X.Y.Z) if you build on top of it, and expect breaking changes until a tagged 1.0.

Hyperparameter sweeps on SLURM, run by an autonomous agent. Declare your search in YAML, hand over a one-line launcher script, and walk away — herd monitor submits trials in stages, diagnoses failures, retries the ones SLURM can fix, and pings you on Discord only when it can't.

📖 Full documentation: allenwlynch.github.io/hyperherd

What you get

  • One-command sweeps. Write a YAML, run herd run, and that's it — no sbatch boilerplate, no manual resubmits.
  • An agent that operates the sweep for you. herd monitor ramps trials in stages, diagnoses failures, bumps memory or wall-time when that's the right fix, and only interrupts you when it isn't.
  • Two-way Discord control. A dedicated channel per sweep with deterministic slash commands (/status, /run, /cancel, /tail, …) and free-form mentions for the agent.
  • Resume from anywhere. Pull the plug, edit the sweep, re-run — completed trials stick, failed ones go back to the queue.
  • Edit mid-run. Bump a learning-rate range or add a value; the next herd run appends the new trials without disturbing the ones already running.
  • Configs you don't have to memorize. A bundled Claude Code skill writes hyperherd.yaml for you from a one-paragraph description.

Hydra is the recommended trainer harness (its CLI consumes the override format natively), but the launcher is free-form bash — parse the arguments however you want.
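If you skip Hydra, the launcher has to pull values out of the override strings itself. A minimal bash sketch of that parsing — the helper name and the `key=value` override shape are illustrative assumptions, not part of HyperHerd:

```shell
#!/usr/bin/env bash
# Hypothetical helper for a free-form launch.sh: extract the value of one
# key=value override from the per-trial argument list.
parse_override() {
  local key="$1"; shift
  local arg
  for arg in "$@"; do
    case "$arg" in
      "$key"=*) printf '%s\n' "${arg#"$key"=}"; return 0 ;;
    esac
  done
  return 1  # key not present in this trial's overrides
}

# Example: suppose launch.sh receives  optimizer=adam lr=3e-4 hidden=512
lr=$(parse_override lr optimizer=adam lr=3e-4 hidden=512)
echo "lr=$lr"   # prints: lr=3e-4
```

Inside a real `launch.sh` you would call `parse_override lr "$@"` and hand the result to your trainer however it expects it.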

Quick start

# Install (Python 3.8+ for the base CLI)
pip install hyperherd

# Install the Claude Code skill for authoring sweep configs
herd install-skill

# Scaffold a workspace
herd init my_experiment

# Edit my_experiment/hyperherd.yaml and my_experiment/launch.sh, then:
herd run my_experiment --dry-run    # preview
herd run my_experiment              # submit
herd status my_experiment           # one-shot status

To run the autonomous monitor (Python ≥ 3.10 + a Discord bot — see Discord setup):

pip install 'hyperherd[monitor]'
herd monitor my_experiment

Have Claude Code set you up

Open Claude Code in your project directory and paste this — it'll walk you through install, config authoring, validation, and (if you want it) the autonomous monitor end-to-end:

Help me set up HyperHerd. Read the setup guide at
https://raw.githubusercontent.com/AllenWLynch/hyperherd/main/docs/setup-help.md
and follow it — start with the Phase 0 interview, then drive the rest.

The full guide is also browsable at allenwlynch.github.io/hyperherd/setup-help.

Documentation

Requirements

  • Python ≥ 3.8 for the base herd CLI
  • Python ≥ 3.10 for the [monitor] extras (the autonomous monitor — Discord, Claude Agent SDK)
  • A SLURM cluster with sbatch, sacct, squeue, scancel on the submission host
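A quick way to sanity-check the last requirement on a submission host — a convenience sketch, not a HyperHerd command:

```shell
# Convenience sketch: confirm the SLURM client tools the herd CLI shells
# out to are actually on PATH before launching a sweep.
check_cmds() {
  local missing=0 cmd
  for cmd in "$@"; do
    if ! command -v "$cmd" >/dev/null 2>&1; then
      echo "missing: $cmd"
      missing=1
    fi
  done
  return "$missing"
}

check_cmds sbatch sacct squeue scancel \
  || echo "SLURM client tools not found on this host"
```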

Download files

Download the file for your platform.

Source Distribution

hyperherd-0.1.1.tar.gz (223.0 kB)


Built Distribution


hyperherd-0.1.1-py3-none-any.whl (131.5 kB)


File details

Details for the file hyperherd-0.1.1.tar.gz.

File metadata

  • Download URL: hyperherd-0.1.1.tar.gz
  • Size: 223.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for hyperherd-0.1.1.tar.gz

  • SHA256: 79c5d0b8af7b4bd3a0d960ccab12bb545de59f88fd9396c1fe659955489d884e
  • MD5: 2116fe0fddf7d11efe45fb427d99efae
  • BLAKE2b-256: 94bc2d1231987a5a6ad183a678cbe7735618514856eb16ce21c4e78559b40f7a
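To check a downloaded file against one of the digests above, a small wrapper over coreutils `sha256sum` — a generic sketch, not a HyperHerd feature:

```shell
# Generic sketch: compare a file's SHA256 digest against an expected value.
verify_sha256() {
  # verify_sha256 FILE EXPECTED_HEX -> exit 0 on match, 1 otherwise
  local actual
  actual=$(sha256sum "$1" | awk '{print $1}')
  [ "$actual" = "$2" ]
}

# e.g. verify_sha256 hyperherd-0.1.1.tar.gz \
#        79c5d0b8af7b4bd3a0d960ccab12bb545de59f88fd9396c1fe659955489d884e
```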


Provenance

The following attestation bundles were made for hyperherd-0.1.1.tar.gz:

Publisher: publish.yml on AllenWLynch/hyperherd


File details

Details for the file hyperherd-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: hyperherd-0.1.1-py3-none-any.whl
  • Size: 131.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for hyperherd-0.1.1-py3-none-any.whl

  • SHA256: 77f4f5d720d6eee855fca9ed4463fd7e2f7d3888ec5ba022abce7dbb47c516c0
  • MD5: 591c30c68ce530563d8ccd00bfcfbae1
  • BLAKE2b-256: 7404d36a96c7abff142d152be80907acf8df42e128827c38b3da25c97b33db6c


Provenance

The following attestation bundles were made for hyperherd-0.1.1-py3-none-any.whl:

Publisher: publish.yml on AllenWLynch/hyperherd

