# hpt

A minimal framework for running hyperparameter tuning.

This repository is under construction :construction:

A minimal hyperparameter tuning framework to help you train hundreds of models.
It is essentially a set of helpful wrappers over Optuna.
## Install

Install the package from PyPI:

```shell
pip install hyperparameter-tuning
```
## Getting started

```python
from hpt.tuner import ObjectiveFunction, OptunaTuner

obj_func = ObjectiveFunction(
    X_train, y_train, X_test, y_test,
    hyperparameter_space=HYPERPARAM_SPACE_PATH,  # path to a YAML file
    eval_metric="accuracy",
    s_train=s_train,
    s_val=s_test,
    threshold=0.50,
)

tuner = OptunaTuner(
    objective_function=obj_func,
    direction="maximize",  # NOTE: can pass other useful study kwargs here (e.g. storage)
)

# Then just run optimize as you would on an optuna.Study object
tuner.optimize(n_trials=20, n_jobs=4)

# Results are stored in tuner.results
tuner.results

# You can reconstruct the best predictor with:
clf = obj_func.reconstruct_model(obj_func.best_trial)
```
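The snippet above assumes the data splits already exist. Below is a minimal sketch of producing them with scikit-learn; treating `s_train`/`s_test` as an extra per-sample attribute column (split alongside `X` and `y`) is an assumption of this example, not something the package prescribes:

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for a real dataset.
rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 10))    # feature matrix
y = rng.integers(0, 2, size=1000)  # binary labels
s = rng.integers(0, 2, size=1000)  # extra per-sample attribute (assumed meaning)

# A single call splits all three arrays consistently, row by row.
X_train, X_test, y_train, y_test, s_train, s_test = train_test_split(
    X, y, s, test_size=0.25, random_state=42
)

print(X_train.shape, X_test.shape)  # (750, 10) (250, 10)
```

Passing several arrays to one `train_test_split` call keeps the rows of `X`, `y`, and `s` aligned across the train and test partitions.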
## Defining a hyperparameter space

The hyperparameter space is provided either as a path to a YAML file or as a dict with the same structure.
Example hyperparameter spaces can be found here and here.
The YAML file must follow this structure:

```yaml
# One or more top-level algorithms
DT:
  # Full classpath of the algorithm's constructor
  classpath: sklearn.tree.DecisionTreeClassifier
  # One or more keyword arguments to be passed to the constructor
  kwargs:
    # Kwargs may be sampled from a distribution
    max_depth:
      type: int           # either 'int' or 'float'
      range: [ 10, 100 ]  # minimum and maximum values
      log: True           # (optionally) whether to use a logarithmic scale
    # Kwargs may be sampled from a fixed set of categories
    criterion:
      - 'gini'
      - 'entropy'
    # Kwargs may be a pre-defined value
    min_samples_split: 4

# You may explore multiple algorithms at once
LR:
  classpath: sklearn.linear_model.LogisticRegression
  kwargs:
    # An example of a float hyperparameter
    C:
      type: float
      range: [ 0.01, 1.0 ]
      log: True
```
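Since the space can also be passed as a dict, the decision-tree entry above can be written directly in Python. This is a sketch that assumes the dict mirrors the YAML structure one-to-one:

```python
# The DT search space from the YAML example, expressed as a plain dict.
hyperparam_space = {
    "DT": {
        "classpath": "sklearn.tree.DecisionTreeClassifier",
        "kwargs": {
            # Sampled from an integer distribution on a log scale.
            "max_depth": {"type": "int", "range": [10, 100], "log": True},
            # Sampled from a fixed set of categories.
            "criterion": ["gini", "entropy"],
            # A pre-defined (constant) value.
            "min_samples_split": 4,
        },
    },
}

print(sorted(hyperparam_space["DT"]["kwargs"]))
# ['criterion', 'max_depth', 'min_samples_split']
```

A dict like this would be passed as `hyperparameter_space=hyperparam_space` in place of the YAML file path.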
### Hashes for hyperparameter-tuning-0.2.10.tar.gz (source distribution)

Algorithm | Hash digest
---|---
SHA256 | 96a8858e0726f6dae278a0ae8e1c671362968e36dd6c319a1e1ebc8be8179976
MD5 | 27a155a8737dea7cccdfc712f3685fd2
BLAKE2b-256 | 6d81383961c8afed2d85a269c128c9bfc1974eabaa455db92b1e3e9ed55dc71c
### Hashes for hyperparameter_tuning-0.2.10-py3-none-any.whl (built distribution)

Algorithm | Hash digest
---|---
SHA256 | 26241b9a83d4df36d6244e6ffd8e4a344d30d91e26789829f1fa059285e8f899
MD5 | a7f583d4656d21c81135fa6d2243fce9
BLAKE2b-256 | 17e259a5b1d7abef2aadc94881a722bb8bf4825416275d1f816c84900f6c3a95