Fork of second-order-random-search with scipy.minimize-like interface.
pySORS
Fork of https://github.com/adamsolomou/second-order-random-search, which implements algorithms described in:
Aurelien Lucchi, Antonio Orvieto, Adamos Solomou. On the Second-order Convergence Properties of Random Search Methods. In Neural Information Processing Systems (NeurIPS), 2021.
This fork implements a scipy.minimize-like interface for those methods.
Usage
```python
import pysors
import numpy as np

def rosenbrock(arr):
    x, y = arr
    a = 1
    b = 100
    return (a - x) ** 2 + b * (y - x ** 2) ** 2

x0 = np.array([-3., -4.])
res = pysors.minimize(rosenbrock, x0=x0, method='bds', stopval=1e-8)
print(res)    # optimization result, holds `x` and `value` attributes
print(res.x)  # solution array
```
The optimizers can also be used step by step. Note that the current solution must be initialized before the loop:

```python
opt = pysors.BDS()
x = np.array([-3., -4.])  # initial guess
for i in range(1000):
    x = opt.step(rosenbrock, x)

print(x)              # last solution array
print(rosenbrock(x))  # objective value at x
```
List of methods
STP
: Stochastic Three Points

BDS
: Basic Direct Search

AHDS
: Approximate Hessian Direct Search

RS
: Two-step random search

RSPI_FD
: Power Iteration Random Search

RSPI_SPSA
: Power Iteration Random Search with SPSA Hessian estimation
References
If you found this useful, please consider citing the authors' paper:
```bibtex
@inproceedings{lucchi2021randomsearch,
  title={On the Second-order Convergence Properties of Random Search Methods},
  author={Aurelien Lucchi and Antonio Orvieto and Adamos Solomou},
  booktitle={Advances in Neural Information Processing Systems (NeurIPS)},
  year={2021}
}
```