Project description
Extensible and Fault-Tolerant Hyperparameter Management
HParams is a thoughtful approach to configuration management for machine learning projects. It enables you to externalize your hyperparameters into a configuration file. In doing so, you can reproduce experiments, iterate quickly, and reduce errors.
Features:
- Approachable and easy-to-use API
- Battle-tested over three years
- Fast with little to no runtime overhead (< 3e-05 seconds) per configured function
- Robust to most use cases with 100% test coverage and 75 tests
- Lightweight with only one dependency
Logo by Chloe Yeo, Corporate Sponsorship by WellSaid Labs
Installation
Make sure you have Python 3. You can then install hparams using pip:
pip install hparams
Install the latest code via:
pip install git+https://github.com/PetrochukM/HParams.git
Oops 🐛
With HParams, you will avoid common but needless hyperparameter mistakes. It will throw a warning or error if:
- A hyperparameter is overwritten.
- A hyperparameter is declared but not set.
- A hyperparameter is set but not declared.
- A hyperparameter type is incorrect.
Finally, HParams is built with developer experience in mind. HParams includes 13 errors and 6 warnings to help catch and resolve issues quickly.
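For instance, a minimal sketch of the typechecking behavior (exactly when the warning or error is raised, and its message, may differ from what the library reports):
# main.py
from hparams import configurable, add_config, HParams, HParam

@configurable
def train(batch_size=HParam(int)):
    pass

# `batch_size` is declared as an `int`, so configuring it with a string
# should trigger one of the warnings or errors described above.
add_config({'main': {'train': HParams(batch_size='32')}})
train()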
Examples
Add HParams to your project by following one of these common use cases:
Configure Training 🤗
Configure your training run, like so:
# main.py
from hparams import configurable, add_config, HParams, HParam
from typing import Union
@configurable
def train(batch_size: Union[int, HParam] = HParam(int)):
    pass

class Model():

    @configurable
    def __init__(self, hidden_size=HParam(int), dropout=HParam(float)):
        pass

add_config({'main': {
    'train': HParams(batch_size=32),
    'Model.__init__': HParams(hidden_size=1024, dropout=0.25),
}})
HParams supports optional configuration typechecking to help you find bugs! 🐛
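Once configured, the decorated callables pick up the configured values when those arguments are omitted. A sketch continuing the main.py example above (the call sites are illustrative):
# Uses `batch_size=32` from the configuration above.
train()
# Uses `hidden_size=1024` and `dropout=0.25` from the configuration above.
model = Model()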
Set Defaults
Configure PyTorch and TensorFlow defaults to match via:
from torch.nn import BatchNorm1d
from hparams import configurable, add_config, HParams
# NOTE: `momentum=0.01` to match TensorFlow defaults
BatchNorm1d.__init__ = configurable(BatchNorm1d.__init__)
add_config({ 'torch.nn.BatchNorm1d.__init__': HParams(momentum=0.01) })
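Once configured, constructing the layer without passing momentum uses the new default. A sketch continuing the example above (the feature size is illustrative):
# `momentum` now defaults to the configured 0.01 instead of PyTorch's 0.1.
layer = BatchNorm1d(128)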
Configure your random seed globally, like so:
# config.py
import random
from hparams import configurable, add_config, HParams
random.seed = configurable(random.seed)
add_config({'random.seed': HParams(a=123)})
# main.py
import config
import random
random.seed()
CLI
Experiment with hyperparameters through your command line, for example:
foo@bar:~$ file.py --torch.optim.adam.Adam.__init__ 'HParams(lr=0.1,betas=(0.999,0.99))'
import sys
from torch.optim import Adam
from hparams import configurable, add_config, parse_hparam_args
Adam.__init__ = configurable(Adam.__init__)
parsed = parse_hparam_args(sys.argv[1:]) # Parse command line arguments
add_config(parsed)
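After add_config, the optimizer picks up the command-line values when those arguments are omitted. A sketch continuing the example above (the model is illustrative):
import torch

model = torch.nn.Linear(10, 1)
# `lr` and `betas` now come from the command-line configuration above.
optimizer = Adam(model.parameters())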
Hyperparameter optimization
Hyperparameter optimization is easy to do. Check this out:
import itertools
from torch.optim import Adam
from hparams import configurable, add_config, HParams
Adam.__init__ = configurable(Adam.__init__)
def train():  # Train the model and return the loss.
    pass

for betas in itertools.product([0.999, 0.99, 0.9], [0.999, 0.99, 0.9]):
    add_config({Adam.__init__: HParams(betas=betas)})  # Grid search over the `betas`.
    train()
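Inside train, the optimizer can then be constructed without passing betas. A sketch of how the train stub above might use the configured optimizer (the model is illustrative):
import torch

def train():
    model = torch.nn.Linear(10, 1)
    # `betas` is injected from the most recent `add_config` call in the loop above.
    optimizer = Adam(model.parameters())
    # ... train the model and return the loss.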
Track Hyperparameters
Easily track your hyperparameters using tools like Comet.
from comet_ml import Experiment
from hparams import get_config
experiment = Experiment()
experiment.log_parameters(get_config())
Multiprocessing: Partial Support
Export a Python functools.partial to use in another process, like so:
from hparams import configurable, HParam
@configurable
def func(hparam=HParam()):
    pass
partial = func.get_configured_partial()
With this approach, you don't have to transfer the global state to the new process. To transfer the global state, you'll want to use get_config and add_config.
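For the global-state approach, a minimal sketch using Python's multiprocessing (the worker function and process setup are illustrative, not part of HParams):
import multiprocessing

from hparams import add_config, get_config

def worker(config):
    # Re-apply the parent process's configuration inside the child process.
    add_config(config)

if __name__ == '__main__':
    process = multiprocessing.Process(target=worker, args=(get_config(),))
    process.start()
    process.join()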
Docs 📖
The complete documentation for HParams is available here.
Contributing
We've released HParams because of a lack of hyperparameter management solutions. We hope that other people can benefit from the project. We are thankful for any contributions from the community.
Contributing Guide
Read our contributing guide to learn about our development process, how to propose bugfixes and improvements, and how to build and test your changes to HParams.
Authors
- Michael Petrochuk — Developer
- Chloe Yeo — Logo Design
Citing
If you find HParams useful for an academic publication, then please use the following BibTeX to cite it:
@misc{hparams,
    author = {Petrochuk, Michael},
    title = {HParams: Hyperparameter management solution},
    year = {2019},
    publisher = {GitHub},
    journal = {GitHub repository},
    howpublished = {\url{https://github.com/PetrochukM/HParams}},
}
File details
Details for the file hparams-0.3.0.tar.gz.
File metadata
- Download URL: hparams-0.3.0.tar.gz
- Upload date:
- Size: 12.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.0.1 requests-toolbelt/0.9.1 tqdm/4.33.0 CPython/3.7.4
File hashes
Algorithm | Hash digest
---|---
SHA256 | a3b4eec68b8f2795a0d3dd8ef1348ecf447866f2ef33eddfa1a23ad5879953fc
MD5 | 86d0e3e92866752e4ee6ca635d16bdad
BLAKE2b-256 | 390f9831f1e406183f7a241ac02551afa7e31900ee846cdc58f595919b6f15f3
File details
Details for the file hparams-0.3.0-py3-none-any.whl.
File metadata
- Download URL: hparams-0.3.0-py3-none-any.whl
- Upload date:
- Size: 11.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.0.1 requests-toolbelt/0.9.1 tqdm/4.33.0 CPython/3.7.4
File hashes
Algorithm | Hash digest
---|---
SHA256 | e3be8927a57b2aeb8801b9e045a62393343765aaab308b6dda015d65ebbc2160
MD5 | 0cf83a10d06d22a05c03dea07ab4ed6a
BLAKE2b-256 | 2fecbcc7011ec23390ac0ccafd031ad9f850430390b4ed3a8b1550788b7fe586