
A parallel version of the L-BFGS-B optimizer of scipy.optimize.minimize().

Project description

optimparallel - A parallel version of scipy.optimize.minimize(method='L-BFGS-B')


Using optimparallel.minimize_parallel() can significantly reduce the optimization time. For an objective function with an execution time of more than 0.1 seconds and p parameters, the optimization speed increases by up to a factor of 1+p when no analytic gradient is specified and 1+p processor cores with sufficient memory are available.
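
As a rough illustration of this claim, both optimizers can be timed on a deliberately slow objective. The following is a minimal, self-contained sketch and not part of the package: the two-parameter objective and the 0.5-second sleep are illustrative choices, the observed timings depend on the number of available cores, and the if __name__ == '__main__': guard is included because the parallel evaluation may spawn worker processes on some platforms.

import time
import numpy as np
from scipy.optimize import minimize
from optimparallel import minimize_parallel

def slow_f(x):
    time.sleep(0.5)                     # mimic an expensive objective
    return sum((x - 14)**2)

if __name__ == '__main__':
    x0 = np.array([10, 20])

    t0 = time.time()
    minimize(fun=slow_f, x0=x0, method='L-BFGS-B')
    print('serial L-BFGS-B  :', round(time.time() - t0, 1), 's')

    t0 = time.time()
    minimize_parallel(fun=slow_f, x0=x0)
    print('minimize_parallel:', round(time.time() - t0, 1), 's')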

A similar extension of the L-BFGS-B optimizer exists in the R package optimParallel.

Installation

To install the package, run:

pip install optimparallel

Usage

Replace scipy.optimize.minimize(method='L-BFGS-B') with optimparallel.minimize_parallel() to execute the minimization in parallel:

from optimparallel import minimize_parallel
from scipy.optimize import minimize
import numpy as np
import time

## objective function (time.sleep() mimics an expensive evaluation)
def f(x, sleep_secs=.5):
    print('fn')                  # printed once per function evaluation
    time.sleep(sleep_secs)
    return sum((x - 14)**2)

## start value
x0 = np.array([10, 20])

## minimize with parallel evaluation of 'fun' and
## its approximate gradient.
o1 = minimize_parallel(fun=f, x0=x0, args=.5)
print(o1)

## test against scipy.optimize.minimize()
o2 = minimize(fun=f, x0=x0, args=.5, method='L-BFGS-B')
print(all(np.isclose(o1.x, o2.x, atol=1e-10)),
      np.isclose(o1.fun, o2.fun, atol=1e-10),
      all(np.isclose(o1.jac, o2.jac, atol=1e-10)))
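
If an analytic gradient is available, it can be passed via the jac argument; minimize_parallel() is intended as a drop-in replacement, so the call below assumes it accepts jac in the same way as scipy.optimize.minimize(). A minimal sketch, where grad_f is the analytic gradient of the example objective f and is an illustrative addition, not part of the original example:

## analytic gradient of f (illustrative addition)
def grad_f(x, sleep_secs=.5):
    time.sleep(sleep_secs)
    return 2*(x - 14)

o3 = minimize_parallel(fun=f, x0=x0, args=.5, jac=grad_f)
print(all(np.isclose(o3.x, o2.x, atol=1e-10)))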

The evaluated x values, fun(x), and jac(x) can be returned:

o1 = minimize_parallel(fun=f, x0=x0, args=.5, parallel={'loginfo': True})
print(o1.loginfo)
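
For instance, the logged values can be inspected after the fit. The sketch below assumes, as an illustration only, that o1.loginfo behaves like a dict with keys 'x', 'fun', and 'jac' holding the logged arrays; check the package documentation for the actual structure.

## trace the logged evaluations; the dict-like access and the key names
## 'x', 'fun', 'jac' are assumptions about loginfo's layout
for key in ('x', 'fun', 'jac'):
    print(key, np.asarray(o1.loginfo[key]).shape)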

More examples are given in example.py.

Author

Contributions

Contributions via pull requests are welcome.

Thanks to contributors:

  • Lewis Blake

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

optimparallel-0.0.3.tar.gz (22.1 kB)

Built Distribution

optimparallel-0.0.3-py3-none-any.whl (24.4 kB)

File details

Details for the file optimparallel-0.0.3.tar.gz.

File metadata

  • Download URL: optimparallel-0.0.3.tar.gz
  • Upload date:
  • Size: 22.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/41.4.0 requests-toolbelt/0.9.1 tqdm/4.36.1 CPython/3.7.4

File hashes

Hashes for optimparallel-0.0.3.tar.gz:

  • SHA256: 1157defd4d8f264d63c36585c1b164919f5381b4707ffc07b0e6966cbcebfccc
  • MD5: 3be9fc260f8ce61e91aae210a7dfb811
  • BLAKE2b-256: d92d5ea326649d68ca43d3ae766df52ac943a220279efcf648f0fdf73b92f4d6


File details

Details for the file optimparallel-0.0.3-py3-none-any.whl.

File metadata

  • Download URL: optimparallel-0.0.3-py3-none-any.whl
  • Upload date:
  • Size: 24.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/41.4.0 requests-toolbelt/0.9.1 tqdm/4.36.1 CPython/3.7.4

File hashes

Hashes for optimparallel-0.0.3-py3-none-any.whl:

  • SHA256: e6c1eb1cbdd35b675fd969432cbdeaf1b0084d9860840aaa830bcc90f70e11f5
  • MD5: 8ea0fb4af0c2b5d1006397c1f045fc52
  • BLAKE2b-256: fec1fee9682b8403ffbf5a6f2329245f3b322280a5377f9f9b53b735401ce314

