HyperParameter
A hyper-parameter library for researchers, data scientists and machine learning engineers.
Quick Start
Object-Style API:
>>> from hyperparameter import HyperParameter
>>> hp = HyperParameter(a=1, b={'c': 2})
>>> hp.a == 1
True
>>> hp.b.c == 2 # (nested parameter)
True
If we want safe access to undefined parameters with default values, we can use `hp()` instead of `hp`:
>>> hp = HyperParameter()
>>> hp().a.b.c.get_or_else(3) # (default value for undefined parameter)
3
>>> hp().a.b.c(3) # (shortcut for `get_or_else`)
3
>>> hp().a.b.c = 4 # set value to param `a.b.c`
>>> hp().a.b.c(3) # (default value is ignored)
4
Scoped Parameter
>>> from hyperparameter import param_scope
# scoped parameter
>>> with param_scope(a=1) as ps:
... ps.a == 1
True
When nested, the parameter modifications are limited to the inner scope:
>>> with param_scope(a=1) as ps:
... with param_scope(a=2) as ps2:
... ps2.a == 2 # True, a=2 for inner scope
... ps.a == 1 # True, a=1 for outer scope
True
True
The nested scope feature can be used to configure the default behavior of a function:

# change function behavior with a scoped parameter
def dnn(inputs):
    # receive parameters using param_scope
    with param_scope() as ps:
        output = linear(inputs)
        return activation_fn(
            output,
            activation=ps().activation("relu"))

# call the function with its default parameters
dnn(x)

# pass a parameter using param_scope
with param_scope(activation="sigmoid"):
    dnn(x)
Predefined Parameter
@auto_param  # convert keyword arguments into hyper-parameters
def model_train(X, y, learning_rate=1.0, penalty='l1'):
    LR = LogisticRegression(C=learning_rate,
                            penalty=penalty)
    LR.fit(X, y)

# override a predefined parameter using `param_scope`
with param_scope('model_train.learning_rate=0.01'):
    model_train(X, y)
Examples
parameter tuning for researchers
This example shows how to use hyperparameter in your research projects and make your experiments reproducible.
experiment tracing for data scientists
This example shows experiment management with hyperparameter, tracing the results with mlflow.tracing.
Todo.
design patterns for system engineers
Todo.
Hashes for hyperparameter-0.3.0-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 9b4f230b36a772e5653e288e3e6130da356021c8ac5e7f46e81b46ba077dacd6
MD5 | aea89be4ae85836084cfd3a0b1dfc670
BLAKE2b-256 | 70c91027ed79dad75efca0718ec1f0c038ee1d124a4a421e75eea328f4b8cc06