A hyperparameter optimization toolbox for convenient and fast prototyping
A hyperparameter optimization and meta-learning toolbox for convenient and fast prototyping of machine-learning models.
For more information, visualizations, and details about the API, check out the website.
Main features
- Thoroughly tested code base
- Compatible with any Python machine-learning framework
- Optimize:
  - Anything from simple models to complex machine-learning pipelines
  - Multi-level ensembles
  - Deep neural network architectures
  - Other optimization techniques (meta-optimization)
  - Or any function you can specify with this API (see the sketch below this list)
- Utilize state-of-the-art optimization techniques like:
- Simulated annealing
- Evolution strategy
- Bayesian optimization
- High performance: optimizer time is negligible for most models
- Choose from a variety of optimization extensions to improve the optimization run
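As a rough, hypothetical illustration of what "any function you can specify" means in practice: the target of the optimization is just a Python function that maps a set of hyperparameter values to a score, and a search space is a plain dictionary of candidate values. The pipeline layout, parameter names, and ranges below are invented for the example and are not prescribed by Hyperactive.

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline

X, y = load_iris(return_X_y=True)

# A search space is just a dict mapping parameter names to candidate values.
# These names and ranges are illustrative, not prescribed by Hyperactive.
search_space = {
    "n_components": [2, 3, 4],
    "n_estimators": list(range(10, 100, 10)),
    "max_depth": [2, 3, 4, 5],
}

def pipeline_objective(params):
    """Score one hyperparameter combination; any function of this shape can be optimized."""
    model = Pipeline([
        ("pca", PCA(n_components=params["n_components"])),
        ("gbc", GradientBoostingClassifier(
            n_estimators=params["n_estimators"],
            max_depth=params["max_depth"],
        )),
    ])
    return cross_val_score(model, X, y, cv=3).mean()
```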
Optimization Techniques | Tested and Supported Packages | Optimization Extensions
--- | --- | ---
Local Search, Random Methods, Markov Chain Monte Carlo, Population Methods, Sequential Methods | Machine Learning, Deep Learning, Distribution | Position Initialization, Resource Allocation
Installation
The most recent version of Hyperactive is available on PyPI:
pip install hyperactive
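Once installed, a minimal run looks roughly like the sketch below, reusing the idea from the feature list of passing a model-evaluating function and a search-space dictionary to the optimizer. The Hyperactive constructor and the add_search/run call pattern shown here are assumptions about the v2-style API rather than verified signatures; check the website linked above for the exact usage.

```python
from hyperactive import Hyperactive  # class name as published on PyPI
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)

search_space = {
    "n_estimators": list(range(10, 200, 10)),
    "max_depth": [3, 5, 10],
}

def model(params):
    clf = RandomForestClassifier(
        n_estimators=params["n_estimators"],
        max_depth=params["max_depth"],
    )
    return cross_val_score(clf, X, y, cv=3).mean()

# NOTE: the call pattern below is an assumption about the v2-style API,
# not a verified signature; see the project website for the exact usage.
hyper = Hyperactive(X, y)
hyper.add_search(model, search_space, n_iter=30)
hyper.run()
```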
Roadmap
v2.0.0 ✔️
- Change API
- Ray integration
v2.1.0
- Save memory of evaluations for later runs (long term memory)
- Warm start sequence based optimizers with long term memory
v2.2.0
- Tree-structured Parzen Estimator
- Spiral optimization
- Downhill-Simplex-Method
v2.3.0
- Helper-classes for model pruning
- Helper-classes for dataset approximation
Experimental algorithms
The following algorithms are of my own design and, to my knowledge, do not yet exist in the technical literature. If any of these algorithms already exists, please share it with me in an issue.
Random Annealing
A combination of simulated annealing and random search.
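No reference implementation is given here, so the following is only one plausible reading of the idea, not Hyperactive's code: draw candidates at random as in random search, but shrink the sampling radius around the current best as a temperature decays, as in simulated annealing. The function name, bounds format, and cooling schedule are all invented for the sketch.

```python
import random

def random_annealing(objective, bounds, n_iter=100, start_temp=1.0, cooling=0.95):
    """Toy sketch: random search whose sampling radius shrinks with a temperature.

    `bounds` is a list of (low, high) tuples, one per dimension. This illustrates
    the idea only and is not the algorithm shipped with Hyperactive.
    """
    best = [random.uniform(lo, hi) for lo, hi in bounds]
    best_score = objective(best)
    temp = start_temp

    for _ in range(n_iter):
        # Sample around the current best; the temperature scales the radius.
        candidate = [
            min(hi, max(lo, b + random.uniform(-1, 1) * temp * (hi - lo)))
            for b, (lo, hi) in zip(best, bounds)
        ]
        score = objective(candidate)
        if score > best_score:
            best, best_score = candidate, score
        temp *= cooling  # cool down, narrowing the search over time

    return best, best_score
```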
Scatter Initialization
Inspired by Hyperband optimization.
License