PyTorch optimizer based on the nonlinear conjugate gradient method
Project description
NCG-optimizer is a set of nonlinear conjugate gradient optimizers for PyTorch.
Install
$ pip install ncg_optimizer
Supported Optimizers
Basic Methods
The theoretical analysis and implementation of all basic methods are based on the "Nonlinear Conjugate Gradient Method" [1], "Numerical Optimization" ([2], [3]), and "Conjugate gradient algorithms in nonconvex optimization" [4].
Linear Conjugate Gradient
The linear conjugate gradient (LCG) method applies only to solving linear systems of equations. It recasts the linear system as the minimization of a quadratic function, so the problem can be solved iteratively without inverting the coefficient matrix.
from ncg_optimizer import LCG

# model = Your Model

optimizer = LCG(model.parameters(), eps=1e-5)

def closure():
    optimizer.zero_grad()
    loss = loss_fn(model(input), target)
    loss.backward()
    return loss

optimizer.step(closure)
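The quadratic reformulation behind LCG can be sketched in a few lines of plain Python, independent of PyTorch and of this package (an illustration only, not ncg_optimizer's implementation): solving Ax = b for a symmetric positive-definite A is equivalent to minimizing f(x) = 0.5 xᵀAx − bᵀx, and CG does so with only matrix-vector products.

```python
# Minimal linear CG illustration (not ncg_optimizer's code): solves A x = b
# for symmetric positive-definite A without inverting A.

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def matvec(A, v):
    return [dot(row, v) for row in A]

def linear_cg(A, b, eps=1e-10, max_iter=100):
    x = [0.0] * len(b)
    r = [bi - Axi for bi, Axi in zip(b, matvec(A, x))]  # residual b - A x
    p = r[:]                                            # first search direction
    rs_old = dot(r, r)
    for _ in range(max_iter):
        if rs_old < eps ** 2:                           # residual small enough
            break
        Ap = matvec(A, p)
        alpha = rs_old / dot(p, Ap)                     # exact step along p
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = dot(r, r)
        beta = rs_new / rs_old                          # keeps directions conjugate
        p = [ri + beta * pi for ri, pi in zip(r, p)]
        rs_old = rs_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = linear_cg(A, b)   # for an n x n system, CG converges in at most n steps
```

For this 2x2 system CG terminates after two iterations at the exact solution x = (1/11, 7/11).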
Fletcher-Reeves
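The Fletcher-Reeves (FR) method extends conjugate gradients to general smooth objectives: the exact step size of the linear case is replaced by a line search (e.g. Armijo or Wolfe), and each new direction is d_{k+1} = −g_{k+1} + β_k d_k with the FR coefficient β_k = ‖g_{k+1}‖² / ‖g_k‖². The following is a standalone sketch of that scheme in plain Python, not the package's FR class; the function `fr_minimize` and its parameters are hypothetical names for illustration.

```python
# Illustrative sketch (not ncg_optimizer's internals): nonlinear CG with the
# Fletcher-Reeves coefficient and an Armijo backtracking line search.

def fr_minimize(f, grad, x0, tol=1e-6, max_iter=500):
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]                      # start with steepest descent
    for _ in range(max_iter):
        gg = sum(gi * gi for gi in g)
        if gg < tol ** 2:                      # stop when the gradient vanishes
            break
        slope = sum(gi * di for gi, di in zip(g, d))
        if slope >= 0:                         # safeguard: restart if d is not a descent direction
            d = [-gi for gi in g]
            slope = -gg
        # Armijo backtracking: halve the step until sufficient decrease holds
        t, fx = 1.0, f(x)
        while f([xi + t * di for xi, di in zip(x, d)]) > fx + 1e-4 * t * slope:
            t *= 0.5
        x = [xi + t * di for xi, di in zip(x, d)]
        g_new = grad(x)
        beta = sum(gi * gi for gi in g_new) / gg   # Fletcher-Reeves coefficient
        d = [-gi + beta * di for gi, di in zip(g_new, d)]
        g = g_new
    return x

# Minimize f(x, y) = (x - 1)^2 + 10 * (y + 2)^2, whose minimum is at (1, -2)
f = lambda v: (v[0] - 1) ** 2 + 10 * (v[1] + 2) ** 2
grad = lambda v: [2 * (v[0] - 1), 20 * (v[1] + 2)]
x_star = fr_minimize(f, grad, [0.0, 0.0])
```

Unlike LCG, the step size here is inexact, so the conjugacy of successive directions is only approximate; the descent-direction safeguard is one common way to keep the iteration stable.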
References
Changes
0.1.0 (2023-02-05)
Initial release.
Added support for LCG, FR, and two line search functions (Armijo & Wolfe).