Project description
Stop messing around and organize your hyperparameters.
HParams is a configuration management solution for machine learning projects.
Logo by Chloe Yeo
Installation
Make sure you have Python 3. You can then install hparams using pip:
pip install hparams
Or install the latest code directly from GitHub via:
pip install git+https://github.com/PetrochukM/HParams.git
What is HParams?
HParams is a configuration management solution for machine learning projects. With HParams, you can externalize your hyperparameters, ensuring that they are extensible, accessible, and maintainable.
Technically speaking, HParams uses the @configurable decorator to inject your hyperparameter dependencies at runtime from a designated configuration file.
Notable Features:
- HParams is small, requiring only one dependency
- @configurable adds less than 1e-05 seconds of overhead
- HParams supports Python's notorious multiprocessing module
Basics
Add HParams to your project by following one of the common use cases:
Configure Training
Configure your training run like so:
from hparams import configurable, add_config, HParams, HParam
@configurable
def train(batch_size=HParam(int)):
    pass

add_config({train: HParams(batch_size=32)})
HParams supports optional configuration type checking to help you find bugs. You can also use HParams with json to support multiple model configurations!
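For example, one way to use HParams with json is to keep each model configuration in its own JSON file and convert it into HParams objects at startup. This is a minimal sketch; the file name config.json and the dotted-path key inside it are illustrative assumptions, not part of the library:
import json
from hparams import add_config, HParams

# NOTE: Hypothetical layout for `config.json`, keyed by the dotted path of a
# `@configurable` callable, e.g. {"my_project.train.train": {"batch_size": 32}}
with open('config.json') as file_:
    json_config = json.load(file_)

add_config({key: HParams(**value) for key, value in json_config.items()})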
Set Defaults
Configure PyTorch and TensorFlow defaults to match, enabling reproducibility, like so:
from torch.nn import BatchNorm1d
from hparams import configurable, add_config, HParams
BatchNorm1d.__init__ = configurable(BatchNorm1d.__init__)

# NOTE: `momentum=0.01` to match TensorFlow defaults
add_config({'torch.nn.BatchNorm1d.__init__': HParams(momentum=0.01)})
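Assuming the configuration above has been applied, constructing the layer as usual should pick up the new default. Continuing the snippet, with 128 as a placeholder feature size:
# `momentum=0.01` is injected from the configuration, so it is not passed here.
layer = BatchNorm1d(num_features=128)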
CLI
Enable rapid command line experimentation, for example:
$ python file.py --torch.optim.adam.Adam.__init__ 'HParams(lr=0.1,betas=(0.999,0.99))'
import sys
from torch.optim import Adam
from hparams import configurable, add_config, parse_hparam_args
Adam.__init__ = configurable(Adam.__init__)
parsed = parse_hparam_args(sys.argv[1:])  # Parse command line arguments
add_config(parsed)
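With the parsed configuration applied, the optimizer can then be constructed as usual and should default to the values passed on the command line. Continuing the snippet, where the linear model is only a placeholder:
import torch

model = torch.nn.Linear(10, 10)  # Placeholder model
# `lr` and `betas` are injected from the command line configuration above.
optimizer = Adam(model.parameters())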
Track Hyperparameters
Easily track your hyperparameters using tools like Comet.
from comet_ml import Experiment
from hparams import get_config
experiment = Experiment()
experiment.log_parameters(get_config())
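get_config() returns the registered hyperparameters (a mapping suitable for Comet's log_parameters), so the same call also works with plain logging. A small sketch:
import logging
from hparams import get_config

logging.basicConfig(level=logging.INFO)
logging.info('Hyperparameters: %s', get_config())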
Multiprocessing: Partial Support
Export a Python functools.partial to use in another process like so:
from hparams import configurable, HParam
@configurable
def func(hparam=HParam(int)):
    pass

partial = func.get_configured_partial()
With this approach, you don't have to transfer the entire global state to the new process.
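For example, a minimal sketch of running the configured partial in a separate process (the Process setup below is an assumption about your worker code, not part of HParams):
from multiprocessing import Process

# The hyperparameters travel with the partial, so the child process
# does not need its own `add_config` call.
process = Process(target=partial)
process.start()
process.join()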
Contributing
We've released HParams because of the lack of hyperparameter management solutions. We hope that other people can benefit from the project. We are thankful for any contributions from the community.
Contributing Guide
Read our contributing guide to learn about our development process, how to propose bugfixes and improvements, and how to build and test your changes to hparams.
Authors
- Michael Petrochuk — Developer
- Chloe Yeo — Logo Design
Citing
If you find hparams useful for an academic publication, then please use the following BibTeX to cite it:
@misc{hparams,
  author = {Petrochuk, Michael},
  title = {HParams: Hyperparameter management},
  year = {2019},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/PetrochukM/HParams}},
}
File details
Details for the file hparams-0.1.1.tar.gz.
File metadata
- Download URL: hparams-0.1.1.tar.gz
- Upload date:
- Size: 10.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/2.0.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.0.1 requests-toolbelt/0.9.1 tqdm/4.36.1 CPython/3.7.4
File hashes
Algorithm | Hash digest
---|---
SHA256 | 5d746f95359d6f78ce2e9c451ac854458acc0f77bc018358028508d9ce587d88
MD5 | a5b9dfbe642ff2a96f41a6d549137a31
BLAKE2b-256 | bd7fd23431211aa386c77270a1257749c78c871140ad218bee205f741714564b
File details
Details for the file hparams-0.1.1-py3-none-any.whl.
File metadata
- Download URL: hparams-0.1.1-py3-none-any.whl
- Upload date:
- Size: 10.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/2.0.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.0.1 requests-toolbelt/0.9.1 tqdm/4.36.1 CPython/3.7.4
File hashes
Algorithm | Hash digest
---|---
SHA256 | 846efc02520ed204a094572471b5feff09379b1064bb68389f530268d7ab3835
MD5 | e6c5e04f0ab9f731946281154360a798
BLAKE2b-256 | ba95be8e7819bdaba289c298cbd251e3edfcbb53248ad55cc1235b494bbe5862