Taking the pain out of choosing a Python global optimizer

Project description

Behold! A colab notebook that recommends a black-box optimizer for your objective function.

Behold! Fifty strategies assigned Elo ratings depending on dimension of the problem and number of function evaluations allowed.

Hello and welcome to HumpDay, a package that helps you choose a Python global optimizer package, and a strategy therein, from Ax-Platform, bayesian-optimization, DLib, HyperOpt, NeverGrad, Optuna, Platypus, PyMoo, PySOT, SciPy (classic and SHGO), Skopt, nlopt, Py-BOBYQA, UltraOpt, and perhaps others by the time you read this. It also exposes some of their functionality through a common calling syntax.

Install base library

pip install humpday

Bleeding edge:

pip install git+https://github.com/microprediction/humpday

File an issue if you have problems. See this thread if you have issues on mac silicon M1.

The following may help if the build fails, particularly on macOS:

pip install cython pybind11
brew install openblas
export OPENBLAS=/opt/homebrew/opt/openblas/lib/

Add optimizers

pip install scikit-optimize
pip install nevergrad
pip install optuna
pip install platypus-opt
pip install poap
pip install pysot
pip install bayesian-optimization

Some of these are very good, but not 100% stable on every platform we've tried.

pip install cmake
pip install ultraopt
pip install dlib 
pip install ax-platform
pip install py-bobyqa
pip install hebo
pip install nlopt
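Since not every optional backend installs cleanly everywhere, a small helper like the following can report which ones are actually importable. This is a hypothetical snippet, not part of humpday; note that import names (e.g. `bayes_opt`) often differ from the pip package names.

```python
import importlib

# Map pip package name -> import (module) name for some optional backends.
OPTIONAL_BACKENDS = {
    "scikit-optimize": "skopt",
    "nevergrad": "nevergrad",
    "optuna": "optuna",
    "platypus-opt": "platypus",
    "pysot": "pySOT",
    "bayesian-optimization": "bayes_opt",
}

def installed_backends():
    """Return the pip names of the optional backends that import successfully."""
    found = []
    for pip_name, module_name in OPTIONAL_BACKENDS.items():
        try:
            importlib.import_module(module_name)
            found.append(pip_name)
        except ImportError:
            pass  # backend not installed (or broken on this platform)
    return found

print(installed_backends())
```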

Recommendations

Pass the dimension of the problem, the function evaluation budget, and the time budget to receive suggestions that are independent of your problem set:

    from pprint import pprint
    from humpday import suggest

    pprint(suggest(n_dim=5, n_trials=130, n_seconds=5*60))

where n_seconds is the total computation budget for the optimizer (not the objective function) over all 130 function evaluations. Or simply pass your objective function, and it will time it and do something sensible:

    import math
    import time

    from humpday import recommend

    def my_objective(u):
        time.sleep(0.01)
        return u[0]*math.sin(u[1])

    recommendations = recommend(my_objective, n_dim=21, n_trials=130)

Meta-minimizer

If you are feeling lucky, use the meta-minimizer, which will choose an optimizer based only on dimension and number of function evaluations, then run it:

    from humpday import minimize
    best_val, best_x = minimize(objective, n_dim=13, n_trials=130)

Here and elsewhere, objective is intended to be minimized on the hypercube [0,1]^n_dim.
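If your search space has other bounds, you can rescale it onto the unit hypercube before handing it to humpday. A minimal sketch, with illustrative bounds that are not part of the library:

```python
import math

# Illustrative search box: x0 in [-5, 5], x1 in [0, 10].
BOUNDS = [(-5.0, 5.0), (0.0, 10.0)]

def rescale(u):
    """Map a point u in [0,1]^n onto the real search box BOUNDS."""
    return [lo + ui * (hi - lo) for ui, (lo, hi) in zip(u, BOUNDS)]

def raw_objective(x):
    """Objective defined on the original (unscaled) coordinates."""
    return x[0] ** 2 + math.sin(x[1])

def objective(u):
    """Unit-cube objective in the form humpday's minimizers expect."""
    return raw_objective(rescale(u))
```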

Points race

If you have more time, call points_race on a list of your own objective functions:

    from humpday import points_race
    points_race(objectives=[my_objective]*2,n_dim=5, n_trials=100)

See the colab notebook.

How it works

In the background, 50+ strategies are assigned Elo ratings by sister repo optimizer-elo-ratings.
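For intuition, the standard Elo update below shows how head-to-head results can be turned into ratings; the actual scheme used by optimizer-elo-ratings may differ (K-factor, initialization, match selection).

```python
def elo_update(r_winner, r_loser, k=16):
    """Standard Elo update: return the new (winner, loser) ratings.

    The winner gains more when the upset is bigger, i.e. when its
    expected win probability was low.
    """
    expected_win = 1.0 / (1.0 + 10 ** ((r_loser - r_winner) / 400.0))
    delta = k * (1.0 - expected_win)
    return r_winner + delta, r_loser - delta
```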

Contribute

By all means contribute more optimizers.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

humpday-0.6.0.tar.gz (40.1 kB)

Uploaded Source

Built Distribution

humpday-0.6.0-py3-none-any.whl (55.1 kB)

Uploaded Python 3
