PyTorch optimizers based on the nonlinear conjugate gradient method

Project description

NCG-optimizer is a set of nonlinear conjugate gradient optimizers for PyTorch.

Install

$ pip install ncg_optimizer

Supported Optimizers

Basic Methods

The theoretical analysis and implementation of all basic methods are based on “Nonlinear Conjugate Gradient Method” [1], “Numerical Optimization” ([2], [3]), and “Conjugate gradient algorithms in nonconvex optimization” [4].

Linear Conjugate Gradient

The linear conjugate gradient (LCG) method applies only to solving systems of linear equations. It recasts the linear system as the minimization of a quadratic function, so the system can be solved iteratively without inverting the coefficient matrix.

[Figure: the LCG algorithm, see https://raw.githubusercontent.com/RyunMi/NCG-optimizer/master/docs/LCG.png]

import ncg_optimizer as optim

# model = Your Model

optimizer = optim.LCG(model.parameters(), eps=1e-5)

def closure():
    optimizer.zero_grad()
    loss = loss_fn(model(input), target)
    loss.backward()
    return loss

optimizer.step(closure)
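
Because LCG targets linear systems, the closure can also be built directly from the quadratic f(x) = (1/2) x^T A x - b^T x, whose minimizer solves A x = b. Below is a minimal sketch of that setup; the matrix A, vector b, and parameter x are illustrative and not part of the package:

import torch
import ncg_optimizer as optim

# Illustrative symmetric positive definite system A x = b
A = torch.tensor([[4.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])
b = torch.tensor([1.0, 2.0, 3.0])

# The unknown vector is the only parameter being optimized
x = torch.zeros(3, requires_grad=True)

optimizer = optim.LCG([x], eps=1e-5)

def closure():
    optimizer.zero_grad()
    # Minimizing 0.5 * x^T A x - b^T x is equivalent to solving A x = b
    loss = 0.5 * x @ (A @ x) - b @ x
    loss.backward()
    return loss

optimizer.step(closure)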

Nonlinear Conjugate Gradient

[Figure: the NCG algorithm, see https://raw.githubusercontent.com/RyunMi/NCG-optimizer/master/docs/NCG.png]

Fletcher-Reeves Method

The Fletcher-Reeves conjugate gradient method (FR method) is the earliest nonlinear conjugate gradient method. It was derived by Fletcher and Reeves in 1964 by extending the conjugate gradient method for solving linear systems to general optimization problems.

The scalar parameter update formula of the FR method is as follows:

$$\beta_k^{FR}=\frac{g_{k+1}^T g_{k+1}}{g_k^T g_k}$$
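
Here g_k denotes the gradient at the iterate x_k. The scalar beta_k enters the standard nonlinear conjugate gradient recurrence (as given in the references above):

$$d_0=-g_0, \qquad d_{k+1}=-g_{k+1}+\beta_k d_k, \qquad x_{k+1}=x_k+\alpha_k d_k$$

where the step size alpha_k is chosen by the line search.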

The convergence analysis of the FR method is closely tied to the line search it is paired with. With exact line search, the FR method converges for general nonconvex functions; with strong Wolfe inexact line search (c2 <= 0.5), it is globally convergent for general nonconvex functions; and the same global convergence holds under generalized Wolfe or Armijo inexact line searches.

import ncg_optimizer as optim

# model = Your Model

optimizer = optim.FR(
    model.parameters(), line_search='Wolfe',
    c1=1e-4, c2=0.5, lr=0.5, eta=5,
)

def closure():
    optimizer.zero_grad()
    loss = loss_fn(model(input), target)
    loss.backward()
    return loss

optimizer.step(closure)
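
For intuition, the FR recurrence can also be written out by hand on a toy quadratic. The following self-contained sketch is independent of the package, and the fixed step size stands in for a proper line search:

import torch

# Toy objective: f(x) = x1^2 + 10 * x2^2, with minimum at the origin
def f(x):
    return x[0] ** 2 + 10.0 * x[1] ** 2

x = torch.tensor([2.0, 1.0], requires_grad=True)
g = torch.autograd.grad(f(x), x)[0]
d = -g  # the first direction is steepest descent

for _ in range(50):
    with torch.no_grad():
        x += 0.02 * d  # fixed step; the real FR method picks alpha_k by line search
    g_new = torch.autograd.grad(f(x), x)[0]
    beta = (g_new @ g_new) / (g @ g)  # FR scalar: ||g_{k+1}||^2 / ||g_k||^2
    d = -g_new + beta * d
    g = g_new

print(x)  # approaches the minimizer (0, 0)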

References

Changes

0.1.0 (2023-02-05)

  • Initial release.

  • Added support for LCG, FR, PRP, HS, CD, DY, LS, HZ, HS_DY and two line search functions (Armijo & Wolfe).

