hpt
A minimal hyperparameter tuning framework to help you train hundreds of models. It is essentially a set of helpful wrappers around Optuna.
This repository is under construction :construction:
Install
pip install hpt
Getting started
```python
from hpt.tuner import ObjectiveFunction, OptunaTuner

# X_train/y_train, X_test/y_test, s_train/s_test, and HYPERPARAM_SPACE_PATH
# are assumed to be defined beforehand (the data splits, the corresponding
# sensitive-attribute columns, and the path to a hyperparameter-space
# config file, respectively).
obj_func = ObjectiveFunction(
    X_train, y_train, X_test, y_test,
    hyperparameter_space=HYPERPARAM_SPACE_PATH,
    eval_metric='accuracy',
    s_train=s_train,
    s_val=s_test,
    threshold=0.50,
)

tuner = OptunaTuner(obj_func)  # NOTE: can pass other useful study kwargs here (e.g. storage)
```
TODO: finish readme.
Download files
Source distribution: hyperparameter-tuning-0.0.1.tar.gz
Built distribution: hyperparameter_tuning-0.0.1-py3-none-any.whl
Hashes for hyperparameter-tuning-0.0.1.tar.gz

| Algorithm | Hash digest |
|---|---|
| SHA256 | 7e05a02cc044c9f779567ecdc8df19aa72a11b2edf00a74a5b66a87d5aeba244 |
| MD5 | cce2e6f66c6821a2198b975d0f3e0db3 |
| BLAKE2b-256 | 79bbf4525ef82c3cac3884580cd4097ad623f0d3b7327a7bf7dc968db6361ebb |
Hashes for hyperparameter_tuning-0.0.1-py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | a03c100e6469dd6001995037f329d98aebc7eac1c4a6a794b46e9db1c83c595c |
| MD5 | 528635128b78ccf3b4ba4f6dad328c1c |
| BLAKE2b-256 | e243234e01ebff1598e56f016f6e78664f5f94a879d35de4ed18da06a0023053 |