PyTorch optimizer based on the nonlinear conjugate gradient method

Project description

NCG-optimizer is a set of nonlinear conjugate gradient optimizers for PyTorch.

Install

$ pip install ncg_optimizer

Supported Optimizers

Basic Methods

The theoretical analysis and implementation of all basic methods are based on "Nonlinear Conjugate Gradient Method" [1], "Numerical Optimization" ([2], [3]), and "Conjugate gradient algorithms in nonconvex optimization" [4].

Linear Conjugate Gradient

The linear conjugate gradient (LCG) method applies only to solving linear systems of equations. It recasts the linear system as an equivalent quadratic minimization problem, so the system can be solved iteratively without inverting the coefficient matrix.

https://raw.githubusercontent.com/RyunMi/NCG-optimizer/master/docs/LCG.png
from ncg_optimizer import LCG

# model = Your Model
optimizer = LCG(model.parameters(), eps=1e-5)

def closure():
    optimizer.zero_grad()
    loss = loss_fn(model(input), target)
    loss.backward()
    return loss

optimizer.step(closure)
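To make the idea above concrete, here is a minimal, self-contained sketch of the linear conjugate gradient iteration in plain Python. It is illustrative only, not the package's implementation: minimizing f(x) = ½ xᵀAx − bᵀx (A symmetric positive definite) yields the solution of Ax = b, and each step uses an exact line search along conjugate directions, so no matrix inverse is ever formed.

```python
# Illustrative linear CG solver for A x = b with A symmetric positive definite.
# Generic helper names (dot, matvec, linear_cg) are ours, not the library's API.

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def matvec(A, x):
    return [dot(row, x) for row in A]

def linear_cg(A, b, tol=1e-10, max_iter=100):
    n = len(b)
    x = [0.0] * n
    r = [bi - Axi for bi, Axi in zip(b, matvec(A, x))]  # residual b - A x
    p = r[:]                                            # first search direction
    rs_old = dot(r, r)
    for _ in range(max_iter):
        Ap = matvec(A, p)
        alpha = rs_old / dot(p, Ap)                     # exact line search step
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * Api for ri, Api in zip(r, Ap)]
        rs_new = dot(r, r)
        if rs_new ** 0.5 < tol:
            break
        # next direction: new residual plus a conjugacy correction
        p = [ri + (rs_new / rs_old) * pi for ri, pi in zip(r, p)]
        rs_old = rs_new
    return x

# Small SPD system with exact solution x = [1, 2].
A = [[4.0, 1.0], [1.0, 3.0]]
b = [6.0, 7.0]
x = linear_cg(A, b)
```

For an n-by-n SPD system, exact arithmetic CG terminates in at most n iterations, which is why the 2-by-2 example above converges almost immediately.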

Fletcher-Reeves
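The Fletcher-Reeves (FR) method extends CG to general nonlinear objectives by choosing β_k = ‖g_k‖² / ‖g_{k−1}‖², where g_k is the current gradient. Below is a minimal, self-contained sketch of FR with Armijo backtracking on a simple quadratic test function; the function, helper names, and constants are our assumptions for illustration, not the package's implementation.

```python
# Illustrative Fletcher-Reeves nonlinear CG with Armijo backtracking.
# Test objective (assumed for the demo): f(x, y) = (x - 3)^2 + 2 (y + 1)^2,
# minimized at (3, -1).

def f(x, y):
    return (x - 3.0) ** 2 + 2.0 * (y + 1.0) ** 2

def grad_f(x, y):
    return (2.0 * (x - 3.0), 4.0 * (y + 1.0))

def fr_cg(x, y, tol=1e-8, max_iter=200):
    gx, gy = grad_f(x, y)
    dx, dy = -gx, -gy                       # start along steepest descent
    for _ in range(max_iter):
        if gx * dx + gy * dy >= 0.0:        # restart if not a descent direction
            dx, dy = -gx, -gy
        # Armijo backtracking line search
        t, fxy, slope = 1.0, f(x, y), gx * dx + gy * dy
        for _ in range(60):
            if f(x + t * dx, y + t * dy) <= fxy + 1e-4 * t * slope:
                break
            t *= 0.5
        x, y = x + t * dx, y + t * dy
        ngx, ngy = grad_f(x, y)
        if (ngx ** 2 + ngy ** 2) ** 0.5 < tol:
            break
        beta = (ngx ** 2 + ngy ** 2) / (gx ** 2 + gy ** 2)  # Fletcher-Reeves
        dx, dy = -ngx + beta * dx, -ngy + beta * dy
        gx, gy = ngx, ngy
    return x, y

x_opt, y_opt = fr_cg(0.0, 0.0)
```

The restart-to-steepest-descent guard matters in practice: with an inexact (Armijo-only) line search, FR directions are not guaranteed to be descent directions, which is why the package pairs FR with Armijo and Wolfe line searches.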

References

Changes

0.1.0 (2023-02-05)

  • Initial release.

  • Added support for LCG, FR, and two line search functions (Armijo & Wolfe).

Download files

Source distribution: ncg-optimizer-0.0.2b0.tar.gz (9.7 kB)

Built distribution: ncg_optimizer-0.0.2b0-py3-none-any.whl (11.7 kB, Python 3)
