Simultaneous Perturbation Stochastic Optimization (SPSA)

The purpose of this package is to provide multivariable optimizers using SPSA. Although many optimizers exist, few implement SPSA, which has its own pros and cons (compared below). Additionally, SPSA has minimal requirements, so you don't have to install large packages like scipy just to optimize a function.

Usage

Synchronous Functions:

x = spsa.optimize(f, x)
x = spsa.optimize(spsa.maximize(f), x)  # For maximization.

for variables in spsa.optimize_iterator(f, x):
    print(variables)

Asynchronous Functions:

x = await spsa.aio.optimize(f, x)
x = await spsa.aio.optimize(spsa.aio.maximize(f), x)  # For maximization.

async for variables in spsa.aio.optimize_iterator(f, x):
    print(variables)
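The asynchronous API expects the objective itself to be a coroutine function, which is useful when each evaluation awaits I/O, such as a remote simulation. A minimal sketch of such an objective (the sphere function here is illustrative; the commented-out call assumes spsa.aio.optimize accepts an async callable, as shown above):

```python
import asyncio
import numpy as np

# Hypothetical async objective: awaits before returning its value,
# standing in for an expensive evaluation such as a network call.
async def sphere(x: np.ndarray) -> float:
    await asyncio.sleep(0)
    return float(np.linalg.norm(x) ** 2)

# With this package installed, it would be driven as:
#     x = await spsa.aio.optimize(sphere, np.array([1.0, 2.0, 3.0]))
```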

Example

import numpy as np
import spsa

# Sample function which has a minimum at 0.
def sphere(x: np.ndarray) -> float:
    return float(np.linalg.norm(x) ** 2)

# Attempt to find the minimum.
print(spsa.optimize(sphere, [1, 2, 3]))
# Sample result:
#     [-5.50452777e-21 -9.48070248e-21  9.78726993e-21]
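For intuition, the core SPSA update can be sketched in a few lines of NumPy. This is an illustrative reimplementation, not this package's internals: the gain sequences and their decay exponents (0.602 and 0.101, standard choices from Spall's formulation) are assumptions, and this package adds refinements beyond this sketch.

```python
import numpy as np

def spsa_minimize(f, x, iterations=1000, a=0.1, c=0.1, seed=0):
    """Minimal SPSA: estimate the gradient from only 2 calls per iteration."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    for k in range(1, iterations + 1):
        ak = a / k ** 0.602  # step size (decaying gain)
        ck = c / k ** 0.101  # perturbation size (decaying gain)
        # Simultaneous perturbation: one random +-1 direction per coordinate,
        # so both calls perturb every coordinate at once.
        delta = rng.choice([-1.0, 1.0], size=x.shape)
        # Central difference along delta; since delta is +-1, 1/delta == delta.
        grad = (f(x + ck * delta) - f(x - ck * delta)) / (2 * ck) * delta
        x -= ak * grad
    return x

x = spsa_minimize(lambda v: float(np.sum(v * v)), [1.0, 2.0, 3.0])  # approaches [0, 0, 0]
```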

Pros & Cons

A comparison of SPSA, Gradient Descent, and Bayesian Optimization is shown below.

                      SPSA              Gradient Descent   Bayesian Optimization
Calls per Iteration   Constant[1] f(x)  1 fprime(x)        Constant f(x)
Stochastic            Stochastic f      Stochastic fprime  Stochastic f
Convergence           Local             Local              Global
Dimensions            Any               Any                <20
Lines of Code         ~100              10-100             >100
Integer Optimization  Applicable[2]     Inapplicable       Applicable[3]
Parallel Calls        Applicable[4]     Not Obvious        Applicable

[1]: Normally requires only 2 calls, but line search and noise-adjusting perturbation sizes require a few extra calls per iteration.

[2]: Use f(round(x)), px=0.5, and px_decay=0.

[3]: Use a different Gaussian process.

[4]: See spsa.aio.
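Footnote [2] can be sketched as a wrapper: the optimizer explores real-valued points, but the objective only ever sees rounded (integer-valued) inputs. The integerize helper below is hypothetical (not part of this package); px and px_decay are passed exactly as the footnote prescribes.

```python
import numpy as np

def integerize(f):
    """Wrap an objective so it only sees rounded (integer-valued) inputs."""
    def g(x: np.ndarray) -> float:
        return f(np.round(x))
    return g

# With this package installed, per footnote [2]:
#     x = spsa.optimize(integerize(f), x, px=0.5, px_decay=0)
```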

