
A Python toolbox for performing gradient-free optimization

Project description


Nevergrad - A gradient-free optimization platform

nevergrad is a Python 3.8+ library. It can be installed with:

pip install nevergrad

More installation options, including Windows installation, and complete instructions are available in the "Getting started" section of the documentation.
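If you want to track the development version instead, a common alternative (a sketch, not the officially documented command set; see the "Getting started" section for the supported options) is to install from a clone of the repository:

git clone https://github.com/facebookresearch/nevergrad.git
cd nevergrad
pip install -e .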

You can join the Nevergrad users' Facebook group here.

Minimizing a function using an optimizer (here NGOpt) is straightforward:

import nevergrad as ng

def square(x):
    return sum((x - .5)**2)

optimizer = ng.optimizers.NGOpt(parametrization=2, budget=100)
recommendation = optimizer.minimize(square)
print(recommendation.value)  # recommended value
>>> [0.49971112 0.5002944]
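If you need finer control over the optimization loop, the same minimization can be written with the ask-and-tell interface; this is a minimal sketch using the ask, tell, and provide_recommendation methods of nevergrad optimizers:

import nevergrad as ng

def square(x):
    return sum((x - .5)**2)

optimizer = ng.optimizers.NGOpt(parametrization=2, budget=100)
for _ in range(optimizer.budget):
    candidate = optimizer.ask()  # draw a candidate point
    loss = square(*candidate.args, **candidate.kwargs)
    optimizer.tell(candidate, loss)  # report its loss back to the optimizer
recommendation = optimizer.provide_recommendation()
print(recommendation.value)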

nevergrad also supports bounded continuous variables, discrete variables, and mixtures of both. To use them, specify the input space:

import nevergrad as ng

def fake_training(learning_rate: float, batch_size: int, architecture: str) -> float:
    # optimal for learning_rate=0.2, batch_size=4, architecture="conv"
    return (learning_rate - 0.2)**2 + (batch_size - 4)**2 + (0 if architecture == "conv" else 10)

# Instrumentation class is used for functions with multiple inputs
# (positional and/or keywords)
parametrization = ng.p.Instrumentation(
    # a log-distributed scalar between 0.001 and 1.0
    learning_rate=ng.p.Log(lower=0.001, upper=1.0),
    # an integer from 1 to 12
    batch_size=ng.p.Scalar(lower=1, upper=12).set_integer_casting(),
    # either "conv" or "fc"
    architecture=ng.p.Choice(["conv", "fc"])
)

optimizer = ng.optimizers.NGOpt(parametrization=parametrization, budget=100)
recommendation = optimizer.minimize(fake_training)

# show the recommended keyword arguments of the function
print(recommendation.kwargs)
>>> {'learning_rate': 0.1998, 'batch_size': 4, 'architecture': 'conv'}

Learn more about parametrization in the documentation!
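As a further illustration, parameters can also be grouped in containers. The following sketch (with a hypothetical toy objective) combines a bounded array with a categorical choice using the ng.p.Dict container and set_bounds from the parametrization API; the function then receives the whole dictionary as a single positional argument:

import nevergrad as ng

# hypothetical mixed search space: a bounded 2D array and a categorical choice
parametrization = ng.p.Dict(
    weights=ng.p.Array(shape=(2,)).set_bounds(lower=-1.0, upper=1.0),
    activation=ng.p.Choice(["relu", "tanh"]),
)

def toy_loss(params: dict) -> float:
    # hypothetical objective: prefers small weights and "relu"
    return float(sum(params["weights"] ** 2)) + (0.0 if params["activation"] == "relu" else 1.0)

optimizer = ng.optimizers.NGOpt(parametrization=parametrization, budget=100)
recommendation = optimizer.minimize(toy_loss)
print(recommendation.value)  # a dict with the recommended "weights" and "activation"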

Example of optimization

Convergence of a population of points to the minimum with two-points DE.
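The corresponding optimizer can also be selected explicitly instead of relying on NGOpt; a minimal sketch running the same toy function with TwoPointsDE:

import nevergrad as ng

def square(x):
    return sum((x - .5)**2)

# TwoPointsDE is the two-points crossover differential-evolution variant
optimizer = ng.optimizers.TwoPointsDE(parametrization=2, budget=100)
recommendation = optimizer.minimize(square)
print(recommendation.value)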

Documentation

Check out our documentation! It is still a work in progress, so don't hesitate to submit issues and/or pull requests (PRs) to update it and make it clearer. The latest versions of our data and of our PDF report are also available.

Citing

@misc{nevergrad,
    author = {J. Rapin and O. Teytaud},
    title = {{Nevergrad - A gradient-free optimization platform}},
    year = {2018},
    publisher = {GitHub},
    journal = {GitHub repository},
    howpublished = {\url{https://GitHub.com/FacebookResearch/Nevergrad}},
}

License

nevergrad is released under the MIT license. See LICENSE for additional details about it. See also our Terms of Use and Privacy Policy.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

nevergrad-1.0.12.tar.gz (413.0 kB)


Built Distribution

nevergrad-1.0.12-py3-none-any.whl (506.3 kB)


File details

Details for the file nevergrad-1.0.12.tar.gz.

File metadata

  • Download URL: nevergrad-1.0.12.tar.gz
  • Upload date:
  • Size: 413.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.9.22

File hashes

Hashes for nevergrad-1.0.12.tar.gz
  • SHA256: ade90dacbb676f6dfc9fee3d4f9c628ec29eec9e3d995a31026ae8a61988b803
  • MD5: 83b84f87531d107822023d72ff060761
  • BLAKE2b-256: d7678ff218de679bdaa6f12c0bb7c7491b6121e3bd46656166f7364800b188ad

See more details on using hashes here.
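As an illustration, a downloaded archive can be checked against the SHA256 digest above with plain Python (generic hashlib usage, not a nevergrad API; the local path below is hypothetical):

import hashlib

path = "nevergrad-1.0.12.tar.gz"  # hypothetical local download path
expected = "ade90dacbb676f6dfc9fee3d4f9c628ec29eec9e3d995a31026ae8a61988b803"

with open(path, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
print("OK" if digest == expected else "hash mismatch")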

File details

Details for the file nevergrad-1.0.12-py3-none-any.whl.

File metadata

  • Download URL: nevergrad-1.0.12-py3-none-any.whl
  • Upload date:
  • Size: 506.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.9.22

File hashes

Hashes for nevergrad-1.0.12-py3-none-any.whl
  • SHA256: 56ff65d6a2f497ecd79af5a796968ee946c05705a6a69ca616eae5988cc5d999
  • MD5: d6104e4ef1a83553068e77593f1413d4
  • BLAKE2b-256: 5a138267afdb84a890d7fc3e6f0eef170b0323915c28879e79e8184f7257cf8a

See more details on using hashes here.
