PyTorch optimizer based on the nonlinear conjugate gradient method
Project description
NCG-optimizer is a collection of nonlinear conjugate gradient optimizers for PyTorch.
Install
$ pip install ncg_optimizer
Supported Optimizers
Basic Methods
The theoretical analysis and implementation of all basic methods are based on “Nonlinear Conjugate Gradient Method” [1], “Numerical Optimization” ([2], [3]), and “Conjugate gradient algorithms in nonconvex optimization” [4].
Linear Conjugate Gradient
The linear conjugate gradient (LCG) method applies only to solving linear systems of equations. It recasts a linear system as the minimization of a quadratic function, so the problem can be solved iteratively without inverting the coefficient matrix.
import ncg_optimizer as optim

# model = Your Model

optimizer = optim.LCG(model.parameters(), eps=1e-5)

def closure():
    optimizer.zero_grad()
    loss = loss_fn(model(input), target)
    loss.backward()
    return loss

optimizer.step(closure)
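To make the idea above concrete, here is a minimal sketch of the classical linear CG recursion in plain PyTorch, independent of ncg_optimizer's API. The function name linear_cg and the example matrix A are illustrative assumptions; the method requires A to be symmetric positive definite.

```python
import torch

# Minimal sketch: linear CG minimizes f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b, without ever inverting A.
def linear_cg(A, b, eps=1e-5, max_iter=100):
    x = torch.zeros_like(b)
    r = b - A @ x          # residual (negative gradient of f)
    p = r.clone()          # first search direction
    for _ in range(max_iter):
        rr = r.dot(r)
        if rr.sqrt() < eps:
            break
        Ap = A @ p
        alpha = rr / p.dot(Ap)   # exact minimizing step along p
        x = x + alpha * p
        r = r - alpha * Ap
        beta = r.dot(r) / rr     # conjugacy-preserving update
        p = r + beta * p
    return x

A = torch.tensor([[4.0, 1.0], [1.0, 3.0]])  # symmetric positive definite
b = torch.tensor([1.0, 2.0])
x = linear_cg(A, b)  # for an n x n system, CG converges in at most n steps
```

For this 2x2 system the iteration reaches the exact solution in two steps, illustrating the finite-termination property of CG on quadratics.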
Nonlinear Conjugate Gradient
Fletcher-Reeves Method
The Fletcher-Reeves conjugate gradient method (FR method) is the earliest nonlinear conjugate gradient method. It was obtained by Fletcher and Reeves in 1964 by extending the conjugate gradient method for solving linear equations to general optimization problems.
The scalar parameter update formula of the FR method is as follows:
$$ \beta_k^{F R}=\frac{g_{k+1}^T g_{k+1}}{g_k^T g_k}$$
The convergence analysis of the FR method is closely tied to the line search it uses. With exact line search, the FR method converges for general nonconvex functions. With strong Wolfe inexact line search (c2 <= 0.5), it is globally convergent for general nonconvex functions, and the same holds under generalized Wolfe or Armijo inexact line searches.
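As a hedged illustration of the strong Wolfe conditions mentioned above, the following sketch checks whether a trial step length satisfies them; the function name satisfies_strong_wolfe is an illustrative assumption, not ncg_optimizer's API, while c1 and c2 play the same role as the constructor arguments of the same names.

```python
import torch

# Strong Wolfe conditions for a step alpha along direction p:
#   sufficient decrease: f(x + alpha p) <= f(x) + c1 * alpha * g(x)^T p
#   curvature:           |g(x + alpha p)^T p| <= c2 * |g(x)^T p|
def satisfies_strong_wolfe(f, grad, x, p, alpha, c1=1e-4, c2=0.5):
    fx, gx = f(x), grad(x)
    x_new = x + alpha * p
    sufficient_decrease = f(x_new) <= fx + c1 * alpha * gx.dot(p)
    curvature = abs(grad(x_new).dot(p)) <= c2 * abs(gx.dot(p))
    return bool(sufficient_decrease and curvature)

# Example on f(x) = ||x||^2 with the steepest-descent direction p = -grad(x)
f = lambda x: x.dot(x)
grad = lambda x: 2 * x
x0 = torch.tensor([1.0, 1.0])
p = -grad(x0)
ok = satisfies_strong_wolfe(f, grad, x0, p, alpha=0.4)   # moderate step: accepted
bad = satisfies_strong_wolfe(f, grad, x0, p, alpha=1.0)  # overshoots: rejected
```

A Wolfe line search iteratively adjusts alpha until both conditions hold, which is what guarantees the global convergence results stated above.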
import ncg_optimizer as optim

# model = Your Model

optimizer = optim.FR(
    model.parameters(),
    line_search='Wolfe',
    c1=1e-4,
    c2=0.5,
    lr=0.5,
    eta=5,
)

def closure():
    optimizer.zero_grad()
    loss = loss_fn(model(input), target)
    loss.backward()
    return loss

optimizer.step(closure)
References
Changes
0.1.0 (2023-02-05)
Initial release.
Added support for LCG, FR, PRP, HS, CD, DY, LS, HZ, and HS_DY, plus two line search functions (Armijo & Wolfe).