Gradient-Free-Optimizers
A collection of gradient free optimizers.
Introduction
Gradient-Free-Optimizers provides a collection of optimization techniques that do not require the gradient of a given point in the search space to compute the next one. This makes gradient-free methods well suited for the hyperparameter optimization of machine learning models. The optimizers in this package only require the score of a point to decide which point to evaluate next.
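The idea can be illustrated with a plain random search (a self-contained sketch, not code from this package): the search proposes candidate positions and uses nothing but their scores to keep track of the best one.

```python
import random

# Toy objective: a parabola whose maximum score is at position 0.
# A gradient-free method never differentiates this function; it only
# asks for the score of candidate positions.
def get_score(pos):
    return -pos * pos

random.seed(0)  # for reproducibility
best_pos, best_score = None, float("-inf")
for _ in range(1000):
    pos = random.randint(-50, 50)  # propose a candidate position
    score = get_score(pos)         # the only feedback the search uses
    if score > best_score:
        best_pos, best_score = pos, score
```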
GFOs-design
This package was created as the optimization backend of the Hyperactive package. Therefore, the API of Gradient-Free-Optimizers is not designed for ease of use; Hyperactive provides a much simpler user experience. However, separating Gradient-Free-Optimizers from Hyperactive has several advantages:
- Other developers can easily use GFOs as an optimization backend if desired
- Separate and more thorough testing
- Better isolation from the complex information flow in Hyperactive. GFOs only uses positions and scores in an N-dimensional search-space, and returns only the new position after each iteration.
- A smaller and cleaner code base for anyone who wants to explore the implementation of these optimization techniques
API
GFOs provides a collection of local, global, population-based and sequential optimization techniques:
- HillClimbingOptimizer
- StochasticHillClimbingOptimizer
- TabuOptimizer
- RandomSearchOptimizer
- RandomRestartHillClimbingOptimizer
- RandomAnnealingOptimizer
- SimulatedAnnealingOptimizer
- StochasticTunnelingOptimizer
- ParallelTemperingOptimizer
- ParticleSwarmOptimizer
- EvolutionStrategyOptimizer
- BayesianOptimizer
- TreeStructuredParzenEstimators
- DecisionTreeOptimizer
Class arguments:
- init_positions (List of numpy arrays. Each array is one start position.)
- space_dim (1D numpy array of length N. Each entry determines the size of the corresponding search-space dimension.)
- opt_para (Dictionary of optimization parameters.)
I wanted to design GFOs so that it takes only the most basic information in each iteration step. Every gradient-free optimization technique should work by receiving only the score of a position in the search space. The score enables the optimizer to decide where to search next. Additionally, my optimizers also need the number of the current iteration. This is an important design choice that makes the usage of single- and population-based optimization techniques identical. The iteration number tells e.g. the EvolutionStrategyOptimizer when to start a new population.
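As a sketch of why the iteration number matters, one possible scheme for a population-based method (an illustration only, not the package's actual implementation) maps each iteration to an individual in the population by cycling through it:

```python
population_size = 4

def individual_for_iteration(nth_iter, population_size):
    # Cycle through the population: iteration 0 -> individual 0,
    # iteration 4 -> individual 0 again, and so on.
    return nth_iter % population_size

# Iterations 0..7 address individuals 0, 1, 2, 3, 0, 1, 2, 3
sequence = [individual_for_iteration(i, population_size) for i in range(8)]
```

With the iteration number passed in explicitly, the calling loop stays identical for single- and population-based optimizers; only the optimizer's internal bookkeeping differs.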
Methods:
- init_pos(nth_init)
- iterate(nth_iter)
- evaluate(score_new)
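To make the protocol concrete, here is a minimal stub optimizer exposing the same three methods (a simplified, self-contained random-search sketch; it is not the package's actual implementation):

```python
import numpy as np

class RandomSearchSketch:
    """Stub optimizer following the init_pos / iterate / evaluate protocol."""

    def __init__(self, init_positions, space_dim, opt_para=None):
        self.init_positions = init_positions
        self.space_dim = space_dim
        self.pos_best = None
        self.score_best = float("-inf")
        self._pos_current = None
        self._rng = np.random.default_rng(0)

    def init_pos(self, nth_init):
        # Return one of the user-supplied start positions
        self._pos_current = self.init_positions[nth_init]
        return self._pos_current

    def iterate(self, nth_iter):
        # Propose a new random position inside the search space
        self._pos_current = np.array(
            [self._rng.integers(0, dim) for dim in self.space_dim]
        )
        return self._pos_current

    def evaluate(self, score_new):
        # Remember the best position seen so far
        if score_new > self.score_best:
            self.score_best = score_new
            self.pos_best = self._pos_current
```

Any object following this protocol can be driven by the same loop shown in the Usage section below, regardless of how it proposes positions internally.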
Usage
import numpy as np
from gradient_free_optimizers import HillClimbingOptimizer
n_iter = 10
# objective function must be provided by user
def get_score(pos_new):
    x1 = pos_new[0]
    return -x1 * x1
# GFOs must know the dimension of the search space and the initial positions
space_dim = np.array([100]) # This is a 1D search-space with 100 positions to explore
init_positions = [np.array([10])] # GFOs will start at a single position: 10
opt = HillClimbingOptimizer(init_positions, space_dim, opt_para={})
# Initialize the starting positions in this loop
for nth_init in range(len(init_positions)):
    pos_new = opt.init_pos(nth_init)
    score_new = get_score(pos_new)  # score must be provided by objective-function
    opt.evaluate(score_new)
# Optimization iteration
for nth_iter in range(len(init_positions), n_iter):
    pos_new = opt.iterate(nth_iter)
    score_new = get_score(pos_new)  # score must be provided by objective-function
    opt.evaluate(score_new)
Hashes for gradient_free_optimizers-0.1.5-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | cee08eb9d7c5dec04449ecd9d0cc08df3ac1db6e0cd40685e862adbfec101d67
MD5 | d823b1b95ab9a7a63f870bbf8e2526ab
BLAKE2b-256 | 6ae4e44a35801e1fa98414e453a90444388cd868c348647dd65d17f81c9173a5