A flexible configuration system for Python projects
Hypster is a lightweight configuration framework for managing and optimizing AI & ML workflows.
⚠️ Hypster is in active development and not yet battle-tested in production. If you’re gaining value and want to promote it to production, please reach out!
Key Features
- 🐍 Pythonic API: Intuitive & minimal syntax that feels natural to Python developers
- 🪆 Hierarchical, Conditional Configurations: Support for nested and swappable configurations
- 📐 Type Safety: Built-in type hints and validation
- 🧪 Hyperparameter Optimization Built-In: Native, first-class optuna support
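To illustrate what "hierarchical, conditional" means in practice, here is a plain-Python sketch (hypothetical names, not Hypster's API): the value chosen for one parameter decides which sub-parameters exist at all.

```python
# Plain-Python sketch of a conditional configuration: which
# sub-parameters apply depends on an earlier choice.
# Hypothetical names; this is not Hypster's API.
def resolve(values):
    retriever = values.get("retriever", "bm25")  # swappable component
    if retriever == "bm25":
        # a lexical retriever exposes its own knobs...
        sub = {"k1": values.get("k1", 1.2), "b": values.get("b", 0.75)}
    else:
        # ...while a dense retriever exposes different ones
        sub = {"embedding_dim": values.get("embedding_dim", 768)}
    return {"retriever": retriever, **sub}

print(resolve({}))
print(resolve({"retriever": "dense", "embedding_dim": 384}))
```

Hypster expresses the same idea declaratively inside a single configuration function, so every branch stays discoverable and type-checked.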
Installation
You can install Hypster using uv:

```shell
uv add hypster
# optional HPO backend
uv add 'hypster[optuna]'
```

Or using pip:

```shell
pip install hypster
```
Quick Start
Define a configuration function and instantiate it with overrides:
```python
from hypster import HP, explore, instantiate
from llm import LLM  # your own model wrapper


def llm_config(hp: HP):
    model_name = hp.select(["gpt-4o-mini", "gpt-4o"], name="model_name")
    temperature = hp.float(0.7, name="temperature", min=0.0, max=1.0)
    max_tokens = hp.int(256, name="max_tokens", min=1, max=4096)
    llm = LLM(model_name=model_name, temperature=temperature, max_tokens=max_tokens)
    return llm


explore(llm_config)
# llm_config
# ├── model_name: select = "gpt-4o-mini" (options: ["gpt-4o-mini", "gpt-4o"])
# ├── temperature: float = 0.7 (0.0-1.0)
# └── max_tokens: int = 256 (1-4096)

llm = instantiate(llm_config, values={"model_name": "gpt-4o-mini", "temperature": 0.3})
llm.invoke("How's your day going?")
```
Use explore(..., values=...) to inspect a specific conditional branch before you instantiate it, or explore(..., return_info=True) to get a JSON-serializable schema object.
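The override behavior above can be pictured with plain dicts: values passed at instantiation win, and everything else falls back to the defaults declared in the config (a conceptual sketch, not Hypster's implementation).

```python
# Conceptual sketch of override resolution: explicit values win,
# unspecified parameters keep their declared defaults.
defaults = {"model_name": "gpt-4o-mini", "temperature": 0.7, "max_tokens": 256}
overrides = {"model_name": "gpt-4o-mini", "temperature": 0.3}

resolved = {**defaults, **overrides}
print(resolved)
# temperature comes from the override, max_tokens from the default
```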
HPO with Optuna
```python
import optuna
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score

from hypster import instantiate
from hypster.hpo.optuna import suggest_values

# model_cfg is a Hypster configuration function (defined as in the
# Quick Start) that builds an sklearn-compatible estimator.


def objective(trial: optuna.Trial) -> float:
    # Map the config's parameter space onto Optuna's define-by-run API
    values = suggest_values(trial, config=model_cfg)
    model = instantiate(model_cfg, values=values)
    X, y = make_classification(
        n_samples=400, n_features=20, n_informative=10, random_state=42
    )
    return cross_val_score(model, X, y, cv=3, n_jobs=-1).mean()


study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=30)
```
Inspiration
Hypster draws inspiration from Meta's Hydra and from the hydra-zen framework. The API design is influenced by Optuna's "define-by-run" API.
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
This project is licensed under the MIT License - see the LICENSE file for details.
File details
Details for the file hypster-0.3.9.tar.gz.
File metadata
- Download URL: hypster-0.3.9.tar.gz
- Upload date:
- Size: 17.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | a09854721f24a07469c9841f0db47309a04f3f15ba25a08431190dd01d171b5d |
| MD5 | c1645be391e081d9268ae408ae7b02e2 |
| BLAKE2b-256 | dc455d82d13cc5296b36ffa7c819c21290f2e966bf2b722a37ae9a07dab663e0 |
File details
Details for the file hypster-0.3.9-py3-none-any.whl.
File metadata
- Download URL: hypster-0.3.9-py3-none-any.whl
- Upload date:
- Size: 19.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | fab94db4aba21cc9f3594d36116803cfc71b8fedba351ef3d773dcaa098d191c |
| MD5 | b7ecd10888a395d090883820c7b3349e |
| BLAKE2b-256 | 8cb5299a2fa074f314bf611ba245c908cf55f17b56ea977aa32312a0268a35d9 |