
A hyperparameter optimization toolbox for convenient and fast prototyping

Project description





A hyperparameter optimization and meta-learning toolbox for convenient and fast prototyping of machine-learning models.



Hyperactive is primarily a hyperparameter optimization toolkit that aims to simplify the model-selection and -tuning process. You can use any machine-learning or deep-learning package, and there is no new syntax to learn. Hyperactive offers high versatility in model optimization because of two characteristics:

  • You can define any kind of model in the objective function. It just has to return a score/metric that gets maximized.
  • The search space accepts not just int, float, or str as data types, but also functions, classes, or any other Python objects (see the extended example after the minimal example below).

For more information, visualizations, and details about the API, check out the website.




Main features


Optimization Techniques:
  • Local Search
  • Global Search
  • Population Methods
  • Sequential Methods

Tested and Supported Packages:
  • Machine Learning
  • Deep Learning
  • Distribution

Optimization Extensions:
  • Position Initialization
  • Resource Allocation

Installation


The most recent version of Hyperactive is available on PyPI:

pip install hyperactive

Minimal example

from sklearn.model_selection import cross_val_score
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.datasets import load_breast_cancer
from hyperactive import Hyperactive

data = load_breast_cancer()
X, y = data.data, data.target

# define the model in a function
def model(para, X, y):
    # optimize one or multiple hyperparameters
    gbc = GradientBoostingClassifier(n_estimators=para['n_estimators'])
    scores = cross_val_score(gbc, X, y)

    return scores.mean()

# create the search space and search_config
search_config = {
    model: {'n_estimators': range(10, 200, 10)}
}

# start the optimization run
opt = Hyperactive(X, y)
opt.search(search_config, n_iter=20)
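
Because Hyperactive passes search-space values through to the objective function unchanged, they are not limited to numbers and strings. The following is a minimal sketch of the second characteristic listed above (my own illustration, not from the official docs), placing estimator classes directly in the search space:

from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.datasets import load_breast_cancer
from hyperactive import Hyperactive

data = load_breast_cancer()
X, y = data.data, data.target

def model(para, X, y):
    # para['estimator'] is a class object drawn from the search space
    clf = para['estimator'](max_depth=para['max_depth'])
    return cross_val_score(clf, X, y).mean()

search_config = {
    model: {
        'estimator': [DecisionTreeClassifier, GradientBoostingClassifier],  # classes, not strings
        'max_depth': list(range(2, 8)),
    }
}

opt = Hyperactive(X, y)
opt.search(search_config, n_iter=20)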

Roadmap

v2.0.0 ✓
  • Change API
  • Ray integration
v2.1.0 ✓
  • Save memory of evaluations for later runs (long term memory)
  • Warm start sequence based optimizers with long term memory
  • Gaussian process regressors from various packages (gpy, sklearn, GPflow, ...) via wrapper
v2.2.0 ✓
  • Add basic dataset meta-features to long term memory
  • Add helper-functions for memory
    • connect two different model/dataset hashes
    • split two different model/dataset hashes
    • delete memory of model/dataset
    • return best known model for dataset
    • return search space for best model
    • return best parameter for best model
v2.3.0
  • Tree-structured Parzen Estimator
  • Decision Tree Optimizer [dto]
  • add "max_sample_size" and "skip_retrain" parameters for sequential model-based optimizers to decrease optimization time
v3.0.0
  • New API
    • improve distributed computing abilities
    • separate optimizer and n_iter for each job
    • expand usage of objective-function
v3.1.0
  • Spiral optimization
  • Downhill-Simplex-Method
  • upgrade particle swarm optimization
  • upgrade evolution strategy
  • add warm start for population based optimizers
  • Meta-Optimization of local optimizers
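
As a concrete picture of the long-term-memory items above, here is a minimal sketch of the underlying idea (illustrative only; the names and helpers are hypothetical, not the package's API): evaluations are keyed by hashes of the objective-function source and the dataset, so later runs on the same model/dataset pair can be warm-started or queried for the best known parameters.

import hashlib
import inspect

def memory_key(model_func, X, y):
    # hash the objective-function source and the (numpy) dataset
    model_hash = hashlib.sha256(inspect.getsource(model_func).encode()).hexdigest()
    data_hash = hashlib.sha256(X.tobytes() + y.tobytes()).hexdigest()
    return (model_hash, data_hash)

long_term_memory = {}  # {(model_hash, data_hash): {params: score}}

def remember(key, para, score):
    # store one evaluation under the model/dataset key
    long_term_memory.setdefault(key, {})[tuple(sorted(para.items()))] = score

def best_known_parameters(key):
    # return the best parameters recorded for this model/dataset pair
    evals = long_term_memory.get(key, {})
    return max(evals, key=evals.get) if evals else None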

Experimental algorithms

The following algorithms are of my own design and, to my knowledge, do not yet exist in the technical literature. If any of these algorithms already exists, please share it with me in an issue.

Random Annealing

A combination between simulated annealing and random search.
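
A rough sketch of how such a combination could look (my own reading, assuming a discrete search space given as lists of values; all names are illustrative): candidates are sampled around the best known position within a radius that shrinks as a temperature decays, so the search starts out global like random search and gradually becomes local like the late phase of simulated annealing.

import random

def random_annealing(objective, space, n_iter=100, temp=1.0, decay=0.97):
    keys = list(space)
    # start from a random position in the search space
    best_idx = {k: random.randrange(len(space[k])) for k in keys}
    best_score = objective({k: space[k][best_idx[k]] for k in keys})
    for _ in range(n_iter):
        cand = {}
        for k in keys:
            # sampling radius scales with temperature: high temp ~ random search
            radius = max(1, int(temp * len(space[k])))
            i = best_idx[k] + random.randint(-radius, radius)
            cand[k] = min(max(i, 0), len(space[k]) - 1)
        score = objective({k: space[k][cand[k]] for k in keys})
        if score > best_score:
            best_idx, best_score = cand, score
        temp *= decay  # cool down: shrink the neighborhood over time
    return {k: space[k][best_idx[k]] for k in keys}, best_score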

Scatter Initialization

Inspired by hyperband optimization.
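
My reading of the idea, as a hedged sketch (all names are illustrative): many random positions are scattered across the search space and scored with a cheap proxy objective, e.g. on a data subsample or with few training epochs, and only the most promising few are kept as starting points for the full run, analogous to hyperband's early discarding of weak configurations.

import random

def scatter_init(cheap_objective, space, n_scatter=20, n_keep=4):
    # sample many random positions across the search space
    candidates = [{k: random.choice(list(v)) for k, v in space.items()}
                  for _ in range(n_scatter)]
    # score them with the cheap proxy and keep the best few as warm starts
    candidates.sort(key=cheap_objective, reverse=True)
    return candidates[:n_keep]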


References

[dto] Scikit-Optimize


License

LICENSE


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distribution

hyperactive-2.3.1-py3-none-any.whl (59.8 kB)

Uploaded Python 3

File details

Details for the file hyperactive-2.3.1-py3-none-any.whl.

File metadata

  • Download URL: hyperactive-2.3.1-py3-none-any.whl
  • Upload date:
  • Size: 59.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.22.0 setuptools/45.2.0.post20200210 requests-toolbelt/0.9.1 tqdm/4.48.0 CPython/3.7.6

File hashes

Hashes for hyperactive-2.3.1-py3-none-any.whl:
  • SHA256: 133dded23e8b68e8b9045e2d8e70f4ecc7888d0a996686c959cb06ee83a08023
  • MD5: e9c67cc15cc5831e1466e15daa3e02e4
  • BLAKE2b-256: 80781f299c276d6e9bd78d3b0962161c5384a17099f9aed245521859efc54ed9

See more details on using hashes here.
