
A parallel version of the L-BFGS-B optimizer of scipy.optimize.minimize().

Project description

optimparallel - A parallel version of scipy.optimize.minimize(method='L-BFGS-B')

[Badges: PyPI | Build Status | Codacy | License: GPL v3]

Using optimparallel.minimize_parallel() can significantly reduce the optimization time. For an objective function with an execution time of more than 0.1 seconds and p parameters, the optimization speed increases by up to a factor of 1+p when no analytic gradient is specified and 1+p processor cores with sufficient memory are available.
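The factor comes from the finite-difference gradient: each L-BFGS-B iteration needs one evaluation of fun(x) plus p more to approximate its gradient, and minimize_parallel() runs these evaluations concurrently instead of one after another. The following rough timing sketch illustrates the effect (the slow_fun toy objective and the numbers are purely illustrative):

import time
import numpy as np
from scipy.optimize import minimize
from optimparallel import minimize_parallel

## toy objective whose cost is dominated by the sleep
def slow_fun(x):
    time.sleep(0.2)
    return np.sum((x - 3.0)**2)

x0 = np.zeros(4)   # p = 4 parameters

t0 = time.time()
minimize(slow_fun, x0, method='L-BFGS-B')
print('serial:  ', round(time.time() - t0, 1), 's')

t0 = time.time()
minimize_parallel(fun=slow_fun, x0=x0)
print('parallel:', round(time.time() - t0, 1), 's')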

A similar extension of the L-BFGS-B optimizer exists in the R package optimParallel.

Installation

To install the package run:

$ pip install optimparallel

Usage

Replace scipy.optimize.minimize(method='L-BFGS-B') with optimparallel.minimize_parallel() to execute the minimization in parallel:

from optimparallel import minimize_parallel
from scipy.optimize import minimize
import numpy as np
import time

## objective function
def f(x, sleep_secs=.5):
    print('fn')
    time.sleep(sleep_secs)
    return sum((x-14)**2)

## start value
x0 = np.array([10,20])

## minimize with parallel evaluation of 'fun' and
## its approximate gradient.
o1 = minimize_parallel(fun=f, x0=x0, args=.5)
print(o1)

## test against scipy.optimize.minimize()
o2 = minimize(fun=f, x0=x0, args=.5, method='L-BFGS-B')
print(all(np.isclose(o1.x, o2.x, atol=1e-10)),
      np.isclose(o1.fun, o2.fun, atol=1e-10),
      all(np.isclose(o1.jac, o2.jac, atol=1e-10)))
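minimize_parallel() is intended as a drop-in replacement, so an analytic gradient and box constraints can be passed via jac and bounds just as with scipy.optimize.minimize(method='L-BFGS-B'); fun and jac are then evaluated in parallel rather than sequentially. A short sketch (the gradient function below is written for the toy objective f above):

## analytic gradient of f(x) = sum((x-14)**2)
def grad_f(x, sleep_secs=.5):
    print('gr')
    time.sleep(sleep_secs)
    return 2*(x - 14)

o3 = minimize_parallel(fun=f, x0=x0, args=.5, jac=grad_f,
                       bounds=((None, None), (None, None)))
print(o3)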

The evaluated x values, fun(x), and jac(x) can be returned:

o1 = minimize_parallel(fun=f, x0=x0, args=.5, parallel={'loginfo': True})
print(o1.loginfo)
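The exact layout of loginfo may differ between versions; assuming it is a dict holding the evaluated 'x', 'fun', and 'jac' values, it can be used, for example, to inspect the optimization path:

## sketch only: inspect the logged evaluations
x_evals = np.array(o1.loginfo['x'])
fun_evals = np.array(o1.loginfo['fun'])
print('evaluations:', len(fun_evals))
print('best logged fun(x):', fun_evals.min())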

More examples are given in example.py.

Note for Windows users: It may be necessary to run minimize_parallel() in the main scope. See example_windows_os.py.
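On Windows, worker processes are spawned rather than forked, so "running in the main scope" means guarding the call with the standard if __name__ == '__main__': idiom. A minimal sketch of such a script layout (the file contents here are illustrative; see example_windows_os.py for the package's own version):

from optimparallel import minimize_parallel
import numpy as np

def f(x):
    return np.sum((x - 14)**2)

if __name__ == '__main__':
    ## worker processes import this module, so the parallel call
    ## must only run in the main process
    o = minimize_parallel(fun=f, x0=np.array([10., 20.]))
    print(o)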

Author

Contributions

Contributions via pull requests are welcome.

To install the development requirements, run:

$ pip install optimparallel

Thanks to contributors:

  • Lewis Blake



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

optimparallel-0.0.4.tar.gz (22.5 kB)

Uploaded Source

Built Distribution

optimparallel-0.0.4-py3-none-any.whl (24.6 kB)

Uploaded Python 3

File details

Details for the file optimparallel-0.0.4.tar.gz.

File metadata

  • Download URL: optimparallel-0.0.4.tar.gz
  • Upload date:
  • Size: 22.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/41.4.0 requests-toolbelt/0.9.1 tqdm/4.36.1 CPython/3.7.4

File hashes

Hashes for optimparallel-0.0.4.tar.gz
  • SHA256: aec80a872e104146f5b234e4274b04c2013512ab8a14a970ea003556baa7113c
  • MD5: 76e92af2a51f89562d2f029f6b98b28a
  • BLAKE2b-256: 9945525a94f5803a41eda27b683ec9c20eaa6b468610df563c37b68c2fdee429


File details

Details for the file optimparallel-0.0.4-py3-none-any.whl.

File metadata

  • Download URL: optimparallel-0.0.4-py3-none-any.whl
  • Upload date:
  • Size: 24.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/41.4.0 requests-toolbelt/0.9.1 tqdm/4.36.1 CPython/3.7.4

File hashes

Hashes for optimparallel-0.0.4-py3-none-any.whl
  • SHA256: 1049de6fd319dbd844cd5ae0d4148def04366017783c889eb192d668543d4f9e
  • MD5: 7db205a927b9d55a60a39eb32c7854ba
  • BLAKE2b-256: 394c025fc43622b33571f95dbfbcfad0aae04fe7a546bf39c3374523b2b7598a

