A Python toolbox for performing gradient-free optimization
Nevergrad - A gradient-free optimization platform
nevergrad is a Python 3.6+ library. It can be installed with:

    pip install nevergrad
More installation options and complete instructions are available in the "Getting started" section of the documentation.
You can join the Nevergrad users Facebook group.
Minimizing a function using an optimizer (here OnePlusOne) is straightforward:

    import nevergrad as ng

    def square(x):
        return sum((x - .5) ** 2)

    optimizer = ng.optimizers.OnePlusOne(parametrization=2, budget=100)
    recommendation = optimizer.minimize(square)
    print(recommendation)  # optimal args and kwargs
    >>> Array{(2,)}[recombination=average,sigma=1.0]:[0.49971112 0.5002944 ]
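To give an intuition for what OnePlusOne does under the hood, here is a toy sketch of a (1+1) evolution strategy in pure Python. This is an illustrative assumption about the general technique, not Nevergrad's actual implementation (which handles parametrization, recombination, and step-size adaptation more carefully); the function name `one_plus_one` and the step-size constants are hypothetical choices for this sketch.

    ```python
    import random

    def one_plus_one(loss, x0, budget=500, sigma=1.0, seed=0):
        """Toy (1+1) evolution strategy: mutate the current point with
        Gaussian noise, keep the child only if it improves the loss,
        and adapt the step size sigma (a rough one-fifth success rule)."""
        rng = random.Random(seed)
        parent = list(x0)
        best = loss(parent)
        for _ in range(budget):
            child = [xi + sigma * rng.gauss(0, 1) for xi in parent]
            score = loss(child)
            if score < best:
                parent, best = child, score
                sigma *= 2.0   # success: widen the search
            else:
                sigma *= 0.84  # failure: narrow it
        return parent

    def square(x):
        # same objective as above, written for plain lists
        return sum((xi - 0.5) ** 2 for xi in x)

    xs = one_plus_one(square, [0.0, 0.0])
    # xs ends up near [0.5, 0.5]
    ```

With a budget of a few hundred evaluations, this already drives the quadratic loss close to zero, which is the same behavior the Nevergrad snippet shows on a (2,)-shaped array.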
[Animation: convergence of a population of points to the minima with two-points DE.]
Documentation
Check out our documentation! It's still a work in progress; don't hesitate to submit issues and/or PRs to update it and make it clearer!
Citing
    @misc{nevergrad,
        author = {J. Rapin and O. Teytaud},
        title = {{Nevergrad - A gradient-free optimization platform}},
        year = {2018},
        publisher = {GitHub},
        journal = {GitHub repository},
        howpublished = {\url{https://GitHub.com/FacebookResearch/Nevergrad}},
    }
License
nevergrad is released under the MIT license. See LICENSE for additional details.
Hashes for nevergrad-0.4.0-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 62add17ee9d1010ff8ceb33f27f8e2dd4c6e7dc3ad7973eb578bc230247258c1
MD5 | 72c2282963e71d93b239bf9b0a418439
BLAKE2b-256 | 093308dae45f0a2ae5eb6f9c65bfd82afc93d5832a6ffabddd4f68f02025130a