A Python toolbox for performing gradient-free optimization



Nevergrad - A gradient-free optimization platform

nevergrad is a Python 3.6+ library. It can be installed with:

pip install nevergrad

You can also install the master branch instead of the latest release with:

pip install git+https://github.com/facebookresearch/nevergrad@master#egg=nevergrad

Alternatively, you can clone the repository and run pip install -e . from inside the repository folder.

By default, this only installs the requirements for the optimization and instrumentation subpackages. If you are also interested in the benchmarking part, install with the [benchmark] flag (example: pip install 'nevergrad[benchmark]'); if you also want the test tools, use the [all] flag (example: pip install -e '.[all]').
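
For quick reference, the install variants mentioned above side by side:

pip install nevergrad                  # optimization and instrumentation subpackages only
pip install 'nevergrad[benchmark]'     # adds the benchmarking requirements
pip install -e '.[all]'                # from a local clone, with the test tools as well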

Goals and structure

The goals of this package are to provide:

  • gradient/derivative-free optimization algorithms, including algorithms able to handle noise.
  • tools to instrument any code, making it painless to optimize your parameters/hyperparameters, whether they are continuous, discrete or a mixture of continuous and discrete variables.
  • functions on which to test the optimization algorithms.
  • benchmark routines in order to compare algorithms easily.

The structure of the package follows these goals; you will therefore find the following subpackages:

  • optimization: implementations of the optimization algorithms
  • instrumentation: tooling to convert code into a well-defined function to optimize
  • functions: implementations of both simple and complex benchmark functions
  • benchmark: routines for running experiments comparing the algorithms on benchmark functions
  • common: a set of tools used throughout the package

Example of optimization

[Animation: convergence of a population of points to the minimum with two-points DE.]


This README is rather general; here are links to more details on:

  • how to perform optimization using nevergrad, including parallelization and a few recommendations on which algorithm to use depending on your settings
  • how to instrument functions with any kind of parameters in order to convert them into a function defined on a continuous vectorial space where optimization can be performed. This also provides a tool to wrap a script or non-Python code into a Python function so that some of its parameters can be tuned.
  • how to benchmark all optimizers on various test functions
  • benchmark results of some standard optimizers on simple test cases
  • examples of optimization for machine learning
  • how to contribute through issues and pull requests, and how to set up your dev environment
  • guidelines for contributing a new algorithm

Basic optimization example

All optimizers assume a centered and reduced prior at the beginning of the optimization (i.e. 0 mean and unitary standard deviation). They are however able to find solutions far from this initial prior.
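
As an illustration (this snippet is not part of nevergrad; it is just a hedged sketch), if you know your optimum lies far from zero or at a different scale, you can map the centered unit prior onto your region of interest inside the objective itself:

import numpy as np

def loss_in_original_scale(y):
    # illustrative objective whose optimum (y = 100) is far from the prior
    return float(np.sum((y - 100.) ** 2))

def loss_for_nevergrad(x):
    # map the 0-mean, unit-std prior onto the region of interest
    return loss_in_original_scale(10. * x + 100.)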

Optimizing (minimizing!) a function using an optimizer (here OnePlusOne) is straightforward:

import nevergrad as ng

def square(x):
    return sum((x - .5)**2)

optimizer = ng.optimizers.OnePlusOne(instrumentation=2, budget=100)
recommendation = optimizer.optimize(square)
print(recommendation)  # optimal args and kwargs
# >>> Candidate(args=(array([0.500, 0.499]),), kwargs={})

recommendation holds the optimal attributes args and kwargs found by the optimizer for the provided function. In this example, the optimal value will be found in recommendation.args[0] and will be a np.ndarray of size 2.

instrumentation=n is a shortcut stating that the function has a single variable of dimension n. See the instrumentation tutorial for more complex instrumentations.
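
As a sketch of a more complex instrumentation, mixing a continuous array with a categorical keyword argument (the objective and the choices below are made up for illustration; see the instrumentation tutorial for the authoritative API):

import nevergrad as ng

def loss(x, activation="relu"):
    # x: 2D continuous array; activation: a discrete choice (illustrative)
    penalty = 0. if activation == "relu" else 1.
    return sum((x - .5) ** 2) + penalty

# one positional 2D array variable plus one categorical keyword argument
instrum = ng.Instrumentation(ng.var.Array(2),
                             activation=ng.var.SoftmaxCategorical(["relu", "tanh"]))
optimizer = ng.optimizers.OnePlusOne(instrumentation=instrum, budget=100)
recommendation = optimizer.optimize(loss)
print(recommendation)  # best (args, kwargs) found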

You can print the full list of optimizers with:

import nevergrad as ng
print(sorted(ng.optimizers.registry.keys()))

The optimization documentation contains more information on how to use several workers, how to take full control of the optimization through the ask and tell interface, and some advice on choosing the proper optimizer for your problem.
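
For instance, a minimal sketch of the ask and tell interface (equivalent to optimizer.optimize for a single worker; the loop below is illustrative):

import nevergrad as ng

def square(x):
    return sum((x - .5) ** 2)

optimizer = ng.optimizers.OnePlusOne(instrumentation=2, budget=100)
for _ in range(optimizer.budget):
    candidate = optimizer.ask()                          # get a candidate to evaluate
    value = square(*candidate.args, **candidate.kwargs)  # evaluate it yourself
    optimizer.tell(candidate, value)                     # report the result back
recommendation = optimizer.provide_recommendation()
print(recommendation)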


Citing

@misc{nevergrad,
    author = {J. Rapin and O. Teytaud},
    title = {{Nevergrad - A gradient-free optimization platform}},
    year = {2018},
    publisher = {GitHub},
    journal = {GitHub repository},
    howpublished = {\url{https://GitHub.com/FacebookResearch/Nevergrad}},
}


nevergrad is released under the MIT license. See LICENSE for additional details.

