
A hyperparameter optimization toolbox for convenient and fast prototyping

Project description





A hyperparameter optimization and meta-learning toolbox for convenient and fast prototyping of machine-learning models.



Hyperactive is primarily a hyperparameter optimization toolkit that aims to simplify the model selection and tuning process. You can use any machine learning or deep learning package, and there is no need to learn new syntax. Hyperactive offers high versatility in model optimization because of two characteristics:

  • You can define any kind of model in the objective function. It just has to return a score/metric that gets maximized.
  • The search space accepts not just int, float, or str as data types, but also functions, classes, or any other Python objects (see the sketch after this list).
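As a concrete illustration of these two points, here is a minimal sketch of tuning a scikit-learn classifier. It follows the v2-style API (passing the data to Hyperactive and a search_config dict that maps the objective function to its parameter ranges) as I understand it; exact names and signatures may differ between releases, and the model and parameter ranges are only examples.

    from sklearn.datasets import load_iris
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import cross_val_score
    from hyperactive import Hyperactive

    data = load_iris()
    X, y = data.data, data.target

    # The objective: build any model you like and return a score to maximize.
    def model(para, X, y):
        gbc = GradientBoostingClassifier(
            n_estimators=para["n_estimators"],
            max_depth=para["max_depth"],
        )
        return cross_val_score(gbc, X, y, cv=3).mean()

    # The search space: parameter ranges keyed by the objective function.
    search_config = {
        model: {
            "n_estimators": range(10, 200, 10),
            "max_depth": range(2, 12),
        }
    }

    opt = Hyperactive(X, y)
    opt.search(search_config)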

For more information, visualizations, and details about the API, check out the website.




Main features


Optimization Techniques:
  • Local Search
  • Random Methods
  • Markov Chain Monte Carlo
  • Population Methods
  • Sequential Methods

Tested and Supported Packages:
  • Machine Learning
  • Deep Learning
  • Distribution

Optimization Extensions:
  • Position Initialization
  • Resource Allocation

Installation


The most recent version of Hyperactive is available on PyPI:

pip install hyperactive

Roadmap

v2.0.0 ✔️
  • Change API
  • Ray integration
v2.1.0
  • Save memory of evaluations for later runs (long-term memory)
  • Warm-start sequence-based optimizers with long-term memory
  • Gaussian process regressors from various packages (GPy, sklearn, GPflow, ...) via a wrapper
v2.2.0
  • Add basic dataset meta-features to long-term memory
  • Enable model-specific meta-learning
v2.3.0
  • Tree-structured Parzen Estimator
  • Spiral optimization
  • Downhill simplex method
v2.4.0
  • Helper classes for early stopping
  • Helper classes for dataset approximation

Experimental algorithms

The following algorithms are of my own design and, to my knowledge, do not yet exist in the technical literature. If any of these algorithms already exists, please share it with me in an issue.

Random Annealing

A combination of simulated annealing and random search.
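Since this is only a one-line description, the following toy sketch is just one plausible reading of it, not Hyperactive's actual implementation: a random search whose sampling radius around the current best shrinks under an annealing-style temperature schedule.

    import random

    def random_annealing(objective, low, high, n_iter=100, temp=1.0, rate=0.97):
        # Toy sketch (assumed interpretation, not Hyperactive's code).
        best_x = random.uniform(low, high)
        best_score = objective(best_x)
        for _ in range(n_iter):
            radius = temp * (high - low)  # wide sampling early, local late
            x = min(max(best_x + random.uniform(-radius, radius), low), high)
            score = objective(x)
            if score > best_score:  # scores are maximized, as in Hyperactive
                best_x, best_score = x, score
            temp *= rate  # cool down
        return best_x, best_score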

Scatter Initialization

Inspired by Hyperband optimization.
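Again, this is only a guess at the idea behind the Hyperband reference, not the package's actual code: score many scattered random positions with a cheap, low-fidelity version of the objective (for example, few training iterations or a data subsample) and keep the best ones as starting positions for the full optimization run.

    import random

    def scatter_init(cheap_objective, search_space, n_samples=50, n_keep=10):
        # Toy sketch (assumed interpretation, not Hyperactive's code).
        candidates = random.sample(search_space, n_samples)
        candidates.sort(key=cheap_objective, reverse=True)  # best first
        return candidates[:n_keep]  # initial positions for the full run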




License

LICENSE


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files available for this release. See the tutorial on generating distribution archives.

Built Distribution

hyperactive-2.1.0-py3-none-any.whl (41.4 kB)

Uploaded Python 3
