
Global, derivative- and parameter-free (hyperparameter) optimization


LIPO is a package for derivative- and parameter-free global optimization, e.g. for hyperparameter tuning. It is based on the dlib package and provides wrappers around its optimization routine.

The algorithm outperforms random search, sometimes by margins as large as 10000x. It is often preferable to Bayesian optimization, which requires "tuning of the tuner". Its performance is on par with moderately to well tuned Bayesian optimization.

The provided implementation can automatically enlarge the search space if the bounds turn out to be too restrictive (i.e. the optimum lies too close to one of them).

See the LIPO algorithm implementation for details.

The author of dlib has written a great blog post describing how the algorithm works.
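
For intuition, the following is a minimal toy sketch of the core LIPO idea (maximization under a fixed Lipschitz constant), not the actual dlib routine: a random candidate is only evaluated if the upper bound implied by the Lipschitz assumption says it could still beat the best value seen so far. dlib's MaxLIPO+TR additionally estimates the Lipschitz constant on the fly and alternates with a quadratic trust-region step. The function name and constants below are illustrative.

import numpy as np

def lipo_maximize(f, lower, upper, n_calls, k=10.0, seed=0):
    # toy LIPO: maximize f over a box, assuming f is k-Lipschitz
    rng = np.random.default_rng(seed)
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    xs = [rng.uniform(lower, upper)]
    ys = [f(xs[0])]
    draws = 0
    while len(xs) < n_calls and draws < 100 * n_calls:
        draws += 1
        candidate = rng.uniform(lower, upper)
        # Lipschitz continuity implies f(candidate) <= min_i (f(x_i) + k * ||candidate - x_i||)
        bound = min(y + k * np.linalg.norm(candidate - x) for x, y in zip(xs, ys))
        if bound >= max(ys):  # only spend a function call on points that could improve the best
            xs.append(candidate)
            ys.append(f(candidate))
    best = int(np.argmax(ys))
    return xs[best], ys[best]

x_best, y_best = lipo_maximize(
    lambda v: -((v[0] - 1.23) ** 6) - (v[1] - 0.3) ** 4,
    lower=[-10, -10], upper=[10, 10], n_calls=100,
)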

Installation

Execute

pip install lipo

Usage

from lipo import GlobalOptimizer

def function(x, y, z):
    # toy objective: two numeric parameters and one categorical parameter
    zdict = {"a": 1, "b": 2}
    return -((x - 1.23) ** 6) + -((y - 0.3) ** 4) * zdict[z]

# optionally seed the optimizer with evaluations that are already known
# (note: y=13 lies outside the initial y bounds given below)
pre_eval_x = dict(x=2.3, y=13, z="b")
evaluations = [(pre_eval_x, function(**pre_eval_x))]

search = GlobalOptimizer(
    function,
    lower_bounds={"x": -10.0, "y": -10},
    upper_bounds={"x": 10.0, "y": -3},
    categories={"z": ["a", "b"]},
    evaluations=evaluations,
    maximize=True,
)

num_function_calls = 1000
search.run(num_function_calls)

The optimizer will automatically extend the search bounds if necessary.
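
To get a feel for the random-search comparison mentioned in the introduction, one rough approach is to wrap the objective so that every evaluation is recorded and then compare the best value found against a plain random-search loop with a comparable budget. The sketch below relies only on the constructor arguments and run() call shown above; the wrapper, the widened y bounds and the budget of 200 calls are illustrative choices.

import random

evaluated = []  # (arguments, value) for every call the optimizer makes

def logged_function(x, y, z):
    value = function(x, y, z)  # `function` as defined in the example above
    evaluated.append((dict(x=x, y=y, z=z), value))
    return value

pre = dict(x=2.3, y=5.0, z="b")
search = GlobalOptimizer(
    logged_function,
    lower_bounds={"x": -10.0, "y": -10.0},  # wider y range so the optimum lies inside the box
    upper_bounds={"x": 10.0, "y": 10.0},
    categories={"z": ["a", "b"]},
    evaluations=[(pre, logged_function(**pre))],
    maximize=True,
)
search.run(200)
best_lipo = max(value for _, value in evaluated)

# plain random search with a comparable budget
best_random = max(
    function(random.uniform(-10, 10), random.uniform(-10, 10), random.choice(["a", "b"]))
    for _ in range(200)
)
print(f"LIPO best: {best_lipo:.4f}, random search best: {best_random:.4f}")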

Further, the package provides an implementation of the scikit-learn interface for hyperparameter search.

from lipo import LIPOSearchCV

search = LIPOSearchCV(
    estimator,
    param_space={"param_1": [0.1, 100], "param_2": ["category_1", "category_2"]},
    n_iter=100
)
search.fit(X, y)
print(search.best_params_)
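
As a self-contained sketch, this is what such a search could look like with a concrete scikit-learn estimator; the dataset, the SVC estimator and the parameter ranges are illustrative choices, not part of the package:

from sklearn.datasets import make_classification
from sklearn.svm import SVC

from lipo import LIPOSearchCV

# toy data and an illustrative estimator; any scikit-learn compatible estimator works
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

search = LIPOSearchCV(
    SVC(),
    # numeric parameters as [lower, upper] bounds, categorical parameters as a list of choices
    param_space={"C": [0.1, 100.0], "kernel": ["linear", "rbf"]},
    n_iter=50,
)
search.fit(X, y)
print(search.best_params_)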

Comparison to other frameworks

For benchmarks, see the notebook in the benchmark directory.

scikit-optimize

This is a Bayesian optimization framework.

+ A well-chosen prior can lead to very good results slightly faster

- If the wrong prior is chosen, tuning can take a long time

- It is not parameter-free: one can get stuck in a local minimum, which means tuning of the tuner can be required

- LIPO can converge faster close to the optimum thanks to its quadratic approximation

- The exploration of the search space is not systematic, i.e. results can vary a lot from run to run

Optuna

+ It parallelizes very well

+ It can stop training runs early, which is very useful, e.g. for neural networks, and can speed up tuning

+ A well-chosen prior can lead to very good results slightly faster

- If the wrong prior is chosen, tuning can take a long time

- It is not parameter-free, i.e. some tuning of the tuner can be required (the defaults are pretty good though)

- LIPO can converge faster close to the optimum thanks to its quadratic approximation

- The exploration of the search space is not systematic, i.e. results can vary a lot from run to run
