
A parallel version of the L-BFGS-B optimizer of scipy.optimize.minimize().

Project description

optimparallel - A parallel version of scipy.optimize.minimize(method='L-BFGS-B')


Using optimparallel.minimize_parallel() can significantly reduce the optimization time. For an objective function that takes more than about 0.1 seconds to evaluate and has p parameters, the optimization speed increases by up to a factor of 1+p when no analytic gradient is specified and 1+p processor cores with sufficient memory are available.
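The factor of 1+p comes from the forward-difference gradient approximation: each L-BFGS-B step needs one evaluation of fun at x plus p evaluations at perturbed points, and these 1+p evaluations are independent, so they can run concurrently. A minimal sketch of the evaluation count (the helper names below are illustrative, not part of the package):

```python
import numpy as np

def f(x):
    """Quadratic test objective, as in the usage example below."""
    f.calls += 1
    return np.sum((x - 14.0) ** 2)
f.calls = 0

def forward_diff_grad(fun, x, eps=1e-6):
    """Forward-difference gradient: 1+p evaluations of fun,
    all independent of each other, hence embarrassingly parallel."""
    f0 = fun(x)
    g = np.empty_like(x, dtype=float)
    for i in range(x.size):
        xi = x.astype(float).copy()
        xi[i] += eps
        g[i] = (fun(xi) - f0) / eps
    return g

x0 = np.array([10.0, 20.0])
g = forward_diff_grad(f, x0)
# p = 2 parameters -> 3 function evaluations per gradient approximation
```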

A similar extension of the L-BFGS-B optimizer exists in the R package optimParallel.

Installation

To install the package run:

$ pip install optimparallel

Usage

Replace scipy.optimize.minimize(method='L-BFGS-B') with optimparallel.minimize_parallel() to execute the minimization in parallel:

from optimparallel import minimize_parallel
from scipy.optimize import minimize
import numpy as np
import time

## objective function
def f(x, sleep_secs=0.5):
    print('fn')
    time.sleep(sleep_secs)
    return sum((x - 14)**2)

## start value
x0 = np.array([10, 20])

## minimize with parallel evaluation of 'fun' and
## its approximate gradient
o1 = minimize_parallel(fun=f, x0=x0, args=0.5)
print(o1)

## test against scipy.optimize.minimize()
o2 = minimize(fun=f, x0=x0, args=0.5, method='L-BFGS-B')
print(all(np.isclose(o1.x, o2.x, atol=1e-10)),
      np.isclose(o1.fun, o2.fun, atol=1e-10),
      all(np.isclose(o1.jac, o2.jac, atol=1e-10)))
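Since minimize_parallel() mirrors the scipy.optimize.minimize() interface, an analytic gradient should be passable via the jac parameter in the same way. The sketch below demonstrates the gradient with plain scipy.optimize.minimize(); swapping in minimize_parallel(fun=f, x0=x0, jac=grad_f) is assumed to work analogously:

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    return np.sum((x - 14.0) ** 2)

def grad_f(x):
    # analytic gradient of sum((x - 14)^2)
    return 2.0 * (x - 14.0)

x0 = np.array([10.0, 20.0])
res = minimize(fun=f, x0=x0, jac=grad_f, method='L-BFGS-B')
# res.x converges to [14, 14], the unique minimizer
```

With an analytic gradient there are no finite-difference evaluations to parallelize, so the speedup from parallel execution is smaller in that case.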

The evaluated x values, fun(x), and jac(x) can be returned:

o1 = minimize_parallel(fun=f, x0=x0, args=.5, parallel={'loginfo': True})
print(o1.loginfo)

More examples are given in example.py and example_extended.ipynb.

Note for Windows users: It may be necessary to run minimize_parallel() in the main scope, i.e., inside an if __name__ == '__main__': guard. See example_windows_os.py.

Author

Contributions

Contributions via pull requests are welcome.

Thanks to contributors:

  • Lewis Blake

Project details


Download files

Download the file for your platform.

Source Distribution

optimparallel-0.0.6.tar.gz (38.5 kB)

Uploaded Source

Built Distribution

optimparallel-0.0.6-py3-none-any.whl (19.6 kB)

Uploaded Python 3

File details

Details for the file optimparallel-0.0.6.tar.gz.

File metadata

  • Download URL: optimparallel-0.0.6.tar.gz
  • Upload date:
  • Size: 38.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/41.4.0 requests-toolbelt/0.9.1 tqdm/4.36.1 CPython/3.7.4

File hashes

Hashes for optimparallel-0.0.6.tar.gz

  • SHA256: 9306c18e8cd5c22e9ce46245fadfa23e23225d44878e6b397266e8593ad6b77a
  • MD5: 1228306767cc35b24d05256f9a69f2b9
  • BLAKE2b-256: 9bb21f650ab91c2b6bc52c94a61bc2936bdbc37cb243a6c12cb2e0d45ca5333d


File details

Details for the file optimparallel-0.0.6-py3-none-any.whl.

File metadata

  • Download URL: optimparallel-0.0.6-py3-none-any.whl
  • Upload date:
  • Size: 19.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/41.4.0 requests-toolbelt/0.9.1 tqdm/4.36.1 CPython/3.7.4

File hashes

Hashes for optimparallel-0.0.6-py3-none-any.whl

  • SHA256: f41d3a2fd78819c790cd39ff28efaabec3f9f66edff664a302aeaced9e8009cf
  • MD5: 99eb9ec80e6ca983cd3478889e5217e0
  • BLAKE2b-256: 0845f6d1602ea93937a48fbc9ed037b908c1c48cc0024b8b229c15e2269c905f

