
A minimal framework for running hyperparameter tuning

Project description

hpt


A minimal hyperparameter tuning framework to help you train hundreds of models.

It's essentially a set of helpful wrappers over optuna.

Consult the package documentation here!

Install

Install the package from PyPI:

pip install hyperparameter-tuning

Getting started

from hpt.tuner import ObjectiveFunction, OptunaTuner

obj_func = ObjectiveFunction(
    X_train, y_train, X_test, y_test,
    hyperparameter_space=HYPERPARAM_SPACE_PATH,    # path to YAML file (or a dict; see below)
    eval_metric="accuracy",
    s_train=s_train,       # optional sensitive-attribute columns, used for
    s_val=s_test,          # fairness-aware metrics
    threshold=0.50,        # decision threshold for binarizing predicted scores
)

tuner = OptunaTuner(
    objective_function=obj_func,
    direction="maximize",    # NOTE: can pass other useful study kwargs here (e.g. storage)
)

# Then just run optimize as you would for an optuna.Study object
tuner.optimize(n_trials=20, n_jobs=4)

# Results are stored in tuner.results
tuner.results

# You can reconstruct the best predictor with:
clf = obj_func.reconstruct_model(obj_func.best_trial)
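
The snippet above assumes pre-split data. A minimal sketch of producing those inputs, using scikit-learn's breast-cancer dataset as a stand-in and a hypothetical binary column as the sensitive attribute (s_train / s_test are only needed if your metric uses them):

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

# Stand-in features and labels; substitute your own dataset.
X, y = load_breast_cancer(return_X_y=True, as_frame=True)

# Hypothetical binary "sensitive attribute" derived from a feature column.
s = (X["mean radius"] > X["mean radius"].median()).astype(int)

X_train, X_test, y_train, y_test, s_train, s_test = train_test_split(
    X, y, s, test_size=0.25, random_state=42,
)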

Defining a hyperparameter space

The hyperparameter space is provided either as a path to a YAML file or as a dict with the same structure. Example hyperparameter spaces here.

The YAML file must follow this structure:

# One or more top-level algorithms
DT:  
    # Full classpath of algorithm's constructor
    classpath: sklearn.tree.DecisionTreeClassifier
    
    # One or more key-word arguments to be passed to the constructor
    kwargs:
        
        # Kwargs may be sampled from a distribution
        max_depth:
            type: int           # either 'int' or 'float'
            range: [ 10, 100 ]  # minimum and maximum values
            log: True           # (optionally) whether to use logarithmic scale
        
        # Kwargs may be sampled from a fixed set of categories
        criterion:
            - 'gini'
            - 'entropy'
        
        # Kwargs may be a pre-defined value
        min_samples_split: 4


# You may explore multiple algorithms at once
LR:
    classpath: sklearn.linear_model.LogisticRegression
    kwargs:
        # An example of a float hyperparameter
        C:
            type: float
            range: [ 0.01, 1.0 ]
            log: True
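
Equivalently, the same space can be passed as a plain dict. A minimal sketch, assuming the dict mirrors the YAML structure above:

hyperparam_space = {
    "DT": {
        "classpath": "sklearn.tree.DecisionTreeClassifier",
        "kwargs": {
            # Sampled from a (log-scaled) integer distribution
            "max_depth": {"type": "int", "range": [10, 100], "log": True},
            # Sampled from a fixed set of categories
            "criterion": ["gini", "entropy"],
            # Pre-defined value
            "min_samples_split": 4,
        },
    },
    "LR": {
        "classpath": "sklearn.linear_model.LogisticRegression",
        "kwargs": {
            "C": {"type": "float", "range": [0.01, 1.0], "log": True},
        },
    },
}

obj_func = ObjectiveFunction(
    X_train, y_train, X_test, y_test,
    hyperparameter_space=hyperparam_space,    # dict instead of a YAML path
    eval_metric="accuracy",
)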

Download files

Download the file for your platform.

Source Distribution

hyperparameter_tuning-0.3.2.tar.gz (24.6 kB)

Built Distribution

hyperparameter_tuning-0.3.2-py3-none-any.whl (26.0 kB)

File details

Details for the file hyperparameter_tuning-0.3.2.tar.gz.

File metadata

  • Download URL: hyperparameter_tuning-0.3.2.tar.gz
  • Size: 24.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.20

File hashes

Hashes for hyperparameter_tuning-0.3.2.tar.gz:

  • SHA256: 673abb105aaf90c134e26a4d6a1fdd3ee22d1b4fff1d89522e088e2826a030bf
  • MD5: db97dfc00a2bb1794d2c9486e7527c02
  • BLAKE2b-256: 647fd84c70259184ba5c8d7a8cda7a514896065f673329bc53ecf7e38eab0633


File details

Details for the file hyperparameter_tuning-0.3.2-py3-none-any.whl.

File hashes

Hashes for hyperparameter_tuning-0.3.2-py3-none-any.whl:

  • SHA256: 71939a31108774b2933c3d19bea4512962a4c72f99bf4f922d0f35252e72467f
  • MD5: a1d9742a8748ead0623d537d33fa407f
  • BLAKE2b-256: a766450903183e0f44aca23ebc4f3b1679770571fee65b1feea661deef9eb5b6

