# pdum.hydra

Hydra utils
A streamlined library for managing Hydra configurations with first-class support for parameter sweeps. Built on top of the Hydra framework, pdum.hydra simplifies sweep generation and configuration management for machine learning experiments. Generate all combinations of hyperparameters with ease, iterate over configurations programmatically, and manage complex experimental setups without the overhead of Hydra's CLI and job launching features. Perfect for ML experimentation workflows that need structured configs with powerful sweep capabilities.
## Installation

Install using pip:

```bash
pip install habemus-papadum-hydra
```

Or using uv:

```bash
uv pip install habemus-papadum-hydra
```
## Usage

### Basic Parameter Sweeps

```python
from pdum.hydra import generate_sweep_configs

# Generate a sweep from a config directory with parameter sweeps
runs = generate_sweep_configs(
    overrides=["training.lr=0.001,0.01,0.1", "model.layers=50,101"],
    config_dir="path/to/config",
    config_name="config",
)

# Iterate over all run configurations
for run in runs.runs:
    print(f"Running with: {run.override_dict}")
    # Access the fully resolved config
    config = run.config
    # Your training code here
    # train_model(config)

# Inspect the sweep parameters
print(f"Total runs: {len(runs.runs)}")  # 6 runs (3 lr × 2 layers)
print(f"Sweep parameters: {runs.override_map}")
```
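The Cartesian expansion behind those 6 runs can be sketched with the standard library. This is an illustrative hand-rolled sketch of the semantics, not the library's actual implementation:

```python
from itertools import product

# Hypothetical sketch: expand comma-separated sweep overrides into one
# override dict per run, mirroring the 3 lr × 2 layers example above.
sweeps = {
    "training.lr": [0.001, 0.01, 0.1],
    "model.layers": [50, 101],
}

# Cartesian product of all sweep dimensions, one dict per run
run_overrides = [
    dict(zip(sweeps.keys(), combo)) for combo in product(*sweeps.values())
]

print(len(run_overrides))  # 6
print(run_overrides[0])    # {'training.lr': 0.001, 'model.layers': 50}
```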
### Using Sweep Files

For complex sweeps, use sweep configuration files. Create a file `config/sweeps/experiment.yaml`:

```yaml
parameters:
  # Lists create sweep dimensions (Cartesian product)
  trial: [0, 1, 2, 3, 4]
  competition:
    - random-acts-of-pizza
    - new-york-city-taxi-fare-prediction
    - tabular-playground-series-may-2022
  agent.code.model:
    - o1
    - o3-mini
    - o1-mini
  # Scalar values are applied to all runs (no sweep)
  limits.steps: 500
  agent.search.max_debug_depth: 20
  agent.search.num_drafts: 5
  limits.code_execution_time: 32400
  agent.code.temp: 1.0
  agent.k_fold_validation: 5
  limits.total_time: 86400
```
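The file's semantics can be sketched in plain Python: list-valued parameters become sweep dimensions, while scalar values are shared by every run. This is a hand-rolled illustration that assumes nothing about the library's internals:

```python
import math

# The parameters from the example sweep file, as a plain dict
parameters = {
    "trial": [0, 1, 2, 3, 4],
    "competition": [
        "random-acts-of-pizza",
        "new-york-city-taxi-fare-prediction",
        "tabular-playground-series-may-2022",
    ],
    "agent.code.model": ["o1", "o3-mini", "o1-mini"],
    "limits.steps": 500,
    "agent.code.temp": 1.0,
}

# Lists sweep; scalars apply to all runs
sweep_dims = {k: v for k, v in parameters.items() if isinstance(v, list)}
shared = {k: v for k, v in parameters.items() if not isinstance(v, list)}

num_runs = math.prod(len(v) for v in sweep_dims.values())
print(num_runs)  # 5 × 3 × 3 = 45
```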
Then use it in your code:

```python
# Load the sweep from the file
runs = generate_sweep_configs(
    overrides=["+sweeps=experiment"],
    config_dir="path/to/config",
)

# This generates 5 trials × 3 competitions × 3 models = 45 runs
print(f"Total runs: {len(runs.runs)}")  # 45

# All runs share the same scalar values
for run in runs.runs:
    assert run.config.limits.steps == 500
    assert run.config.agent.code.temp == 1.0
    # But each run has different sweep parameter values
    print(f"Trial {run.config.trial}, Model: {run.config.agent.code.model}")
```
### Combining Sweep Files with Overrides

You can combine sweep files with additional command-line overrides:

```python
runs = generate_sweep_configs(
    overrides=[
        "+sweeps=experiment",  # Load the base sweep
        "optimizer=adam,sgd",  # Add another sweep dimension
    ],
    config_dir="path/to/config",
)

# Now: 45 runs × 2 optimizers = 90 runs
print(f"Total runs: {len(runs.runs)}")  # 90
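With 90 runs, you will typically want to launch them in batches. A minimal stdlib sketch, assuming only that `runs.runs` is an ordered sequence (simulated here with plain indices):

```python
# Hypothetical sketch: split run indices into fixed-size batches for
# sequential or parallel launching. `run_indices` stands in for runs.runs.
run_indices = list(range(90))

def batched(items, size):
    """Yield consecutive chunks of at most `size` items."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

batches = list(batched(run_indices, 16))
print(len(batches))      # 6 batches (5 full + 1 partial)
print(len(batches[-1]))  # 10 runs in the final batch
```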
## Development

This project uses uv for dependency management.

### Setup

```bash
# Install uv if you haven't already
curl -LsSf https://astral.sh/uv/install.sh | sh

# Clone the repository
git clone https://github.com/habemus-papadum/pdum_hydra.git
cd pdum_hydra

# Provision the entire toolchain (uv sync, pre-commit hooks)
./scripts/setup.sh
```
Important for development:

- `./scripts/setup.sh` is idempotent; rerun it after pulling dependency changes
- Use `uv sync --frozen` to ensure the lockfile is respected when installing Python deps
### Running Tests

```bash
# Run all tests
uv run pytest

# Run a specific test file
uv run pytest tests/test_example.py

# Run a specific test function
uv run pytest tests/test_example.py::test_version

# Run tests with coverage
uv run pytest --cov=src/pdum/hydra --cov-report=xml --cov-report=term
```
### Code Quality

```bash
# Check code with ruff
uv run ruff check .

# Format code with ruff
uv run ruff format .

# Fix auto-fixable issues
uv run ruff check --fix .
```
### Building

```bash
# Run the full build script
./scripts/build.sh

# Or build just the Python distribution artifacts
uv build
```
### Publishing

```bash
# Build and publish to PyPI (requires credentials)
./scripts/publish.sh
```
### Automation scripts

- `./scripts/setup.sh` – bootstrap uv, pnpm, the widget bundle, and pre-commit hooks
- `./scripts/build.sh` – reproduce the release build locally
- `./scripts/pre-release.sh` – run the full battery of quality checks
- `./scripts/release.sh` – orchestrate the release (creates tags, publishes to PyPI/GitHub)
- `./scripts/test_notebooks.sh` – execute demo notebooks (uses `./scripts/nb.sh` under the hood)
## License

MIT License - see the LICENSE file for details.