
Simultaneous Perturbation Stochastic Approximation (SPSA)

The purpose of this package is to provide multivariable optimizers using SPSA. Although many optimization libraries exist, few implement SPSA, which has its own pros and cons (compared below). Additionally, SPSA has minimal requirements, so you don't have to install large packages like scipy just to optimize a function.
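For context, the core SPSA idea is to estimate the gradient from only two function evaluations per iteration, regardless of dimension, by perturbing every coordinate at once with a random ±1 vector. The following is a minimal sketch of that idea for illustration only; it is not this package's implementation, and the step sizes are arbitrary:

import numpy as np

def spsa_sketch(f, x, lr=0.1, c=0.1, iterations=1000):
    """Minimal SPSA sketch: two calls to f per iteration, any dimension."""
    x = np.asarray(x, dtype=float)
    rng = np.random.default_rng()
    for _ in range(iterations):
        delta = rng.choice([-1.0, 1.0], size=x.shape)  # random +-1 perturbation
        # Simultaneous perturbation gradient estimate (note 1/delta_i = delta_i for +-1).
        grad = (f(x + c * delta) - f(x - c * delta)) / (2 * c) * delta
        x -= lr * grad  # gradient-descent-style update
    return x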

PIP Install

Unix/macOS:

python3 -m pip install spsa

Windows:

py -m pip install spsa

Usage

Import:

import spsa

Synchronous Functions:

x = spsa.maximize(f, x)
x = spsa.minimize(f, x)

for variables in spsa.iterator.maximize(f, x):
    print(variables)

for variables in spsa.iterator.minimize(f, x):
    print(variables)

Asynchronous Functions:

# spsa.aio - Asynchronous IO.
# Performs function calls concurrently every iteration.
# Useful for:
#     IO-bound functions.
#     Functions running in executors.
#     Running `spsa` asynchronously with other code (non-blocking).
# See `help(spsa.aio)` for more details.

x = await spsa.aio.maximize(async_def_f, x)
x = await spsa.aio.minimize(async_def_f, x)

async for variables in spsa.aio.iterator.maximize(async_def_f, x):
    print(variables)

async for variables in spsa.aio.iterator.minimize(async_def_f, x):
    print(variables)
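For example, an IO-bound objective can be written with async def and passed directly. A minimal sketch, using asyncio.sleep as a stand-in for real IO such as a network request:

import asyncio
import numpy as np
import spsa

async def noisy_sphere(x: np.ndarray) -> float:
    await asyncio.sleep(0.01)  # stand-in for IO, e.g. a network request
    return float(np.linalg.norm(x) ** 2)

async def main() -> None:
    x = await spsa.aio.minimize(noisy_sphere, np.array([1.0, 2.0, 3.0]))
    print(x)

asyncio.run(main())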

Synchronous Functions with Multiprocessing:

# spsa.amp - Asynchronous Multiprocessing.
# Useful for:
#     Running `spsa` asynchronously with other code (non-blocking).
#     Running `spsa` in an executor for efficiently running multiple at a time.
# Not for improving the speed of a single `spsa` call.
# See `help(spsa.amp)` for more details.

x = await spsa.amp.maximize(def_f, x)
x = await spsa.amp.minimize(def_f, x)

async for variables in spsa.amp.iterator.maximize(def_f, x):
    print(variables)

async for variables in spsa.amp.iterator.minimize(def_f, x):
    print(variables)
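Since spsa.amp dispatches function calls to other processes, the objective should generally be picklable, i.e. a module-level def rather than a lambda or closure. A minimal sketch:

import asyncio
import numpy as np
import spsa

# A module-level function, so it can be sent to worker processes.
def sphere(x: np.ndarray) -> float:
    return float(np.linalg.norm(x) ** 2)

async def main() -> None:
    x = await spsa.amp.minimize(sphere, np.array([1.0, 2.0, 3.0]))
    print(x)

if __name__ == "__main__":  # guard required for multiprocessing on some platforms
    asyncio.run(main())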

Example

import numpy as np
import spsa

# Squared distance to 0.
def sphere(x: np.ndarray) -> float:
    return float(np.linalg.norm(x) ** 2)

# Attempt to find the minimum.
print(spsa.minimize(sphere, np.array([1.0, 2.0, 3.0])))

# Sample output:
#     [-5.50452777e-21 -9.48070248e-21  9.78726993e-21]

Pros & Cons

A comparison of SPSA, Gradient Descent, and Bayesian Optimization is shown below.

|                      | SPSA             | Gradient Descent  | Bayesian Optimization |
|----------------------|------------------|-------------------|-----------------------|
| Calls per Iteration  | Constant[1] f(x) | 1 fprime(x)       | Constant f(x)         |
| Stochastic           | Stochastic f     | Stochastic fprime | Stochastic f          |
| Convergence          | Local            | Local             | Global                |
| Dimensions           | Any              | Any               | <20                   |
| Lines of Code        | ~100             | 10-100            | >100                  |
| Integer Optimization | Applicable[2]    | Inapplicable      | Applicable[3]         |
| Parallel Calls       | Applicable[4]    | Not Obvious       | Applicable            |

[1]: Normally requires only 2 calls, but line search and noise-adjusting perturbation sizes require a few extra calls per iteration.

[2]: Use f(round(x)); a sketch follows these notes.

[3]: Use a different Gaussian process.

[4]: See spsa.aio and spsa.amp.
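To make footnote [2] concrete, here is a minimal sketch of integer optimization by rounding inside the objective. The target point is arbitrary, and convergence depends on the perturbation size relative to the integer grid:

import numpy as np
import spsa

def f(x: np.ndarray) -> float:
    # Underlying objective that only makes sense on integers.
    return float(np.linalg.norm(x - np.array([2.0, -3.0, 5.0])) ** 2)

# Round inside the wrapper so SPSA searches a continuous relaxation.
x = spsa.minimize(lambda x: f(np.round(x)), np.array([0.0, 0.0, 0.0]))
print(np.round(x))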
