An implementation of the PSGD-QUAD optimizer in PyTorch.

Project description

PSGD-QUAD

pip install quad-torch

An implementation of PSGD-QUAD for PyTorch.

import torch
from quad_torch import QUAD

model = torch.nn.Linear(10, 10)
optimizer = QUAD(
    model.parameters(),
    lr=0.001,
    lr_style="adam",  # "adam", "mu-p", or None
    momentum=0.95,
    weight_decay=0.1,
    preconditioner_lr=0.6,
    max_size_dense=8192,
    max_skew_dense=1.0,
    noise_scale=1e-8,
    normalize_grads=False,
    dtype=torch.bfloat16,
)
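The max_size_dense and max_skew_dense arguments control when a tensor dimension gets a dense preconditioner factor rather than a cheaper diagonal one. As a minimal sketch of one plausible gating heuristic (an assumption for illustration, modeled on similar knobs in the PSGD family; preconditioner_kind is a hypothetical helper, not quad_torch's actual code):

```python
def preconditioner_kind(dim_size, total_numel,
                        max_size_dense=8192, max_skew_dense=1.0):
    """Illustrative gate for dense vs. diagonal preconditioner factors.

    Hypothetical helper (assumed behavior, not quad_torch's real code):
    a dimension falls back to a diagonal factor when it is too large,
    or too skewed relative to the rest of the tensor.
    """
    if dim_size > max_size_dense:
        return "diagonal"  # a dense dim x dim factor would be too large
    # "skew" compares this dimension to the product of all the others
    if dim_size > max_skew_dense * (total_numel / dim_size):
        return "diagonal"
    return "dense"

# A square 512x512 weight gets dense factors on both dims;
# a 50000-row embedding falls back to diagonal on its first dim.
print(preconditioner_kind(512, 512 * 512))      # dense
print(preconditioner_kind(50000, 50000 * 512))  # diagonal
```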

lr_style can be "adam" for Adam-style learning-rate scaling, "mu-p" for muP-style scaling based on sqrt(G.shape[-2]), or None for PSGD's default scaling, which normalizes the update to RMS = 1.0.
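For intuition on the "mu-p" option: one plausible reading (an assumption, not verified against quad_torch's source) is that each matrix parameter's step is scaled by 1/sqrt(G.shape[-2]), so wider layers take proportionally smaller steps. mup_lr_factor below is a hypothetical helper, purely for illustration:

```python
import math

def mup_lr_factor(grad_shape):
    """Hypothetical muP-style factor based on sqrt(G.shape[-2]);
    illustrative only, not quad_torch's actual implementation."""
    if len(grad_shape) < 2:
        return 1.0  # assumed fallback for 1-D params such as biases
    return 1.0 / math.sqrt(grad_shape[-2])

print(mup_lr_factor((1024, 512)))  # 1/sqrt(1024) = 0.03125
print(mup_lr_factor((64, 64)))     # 1/sqrt(64) = 0.125
```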

Resources

Xi-Lin Li's repo: https://github.com/lixilinx/psgd_torch

PSGD papers and resources, as listed in Xi-Lin's repo:

  1. Xi-Lin Li. Preconditioned stochastic gradient descent, arXiv:1512.04202, 2015. (General ideas of PSGD, preconditioner fitting losses and Kronecker product preconditioners.)
  2. Xi-Lin Li. Preconditioner on matrix Lie group for SGD, arXiv:1809.10232, 2018. (Focus on preconditioners with the affine Lie group.)
  3. Xi-Lin Li. Black box Lie group preconditioners for SGD, arXiv:2211.04422, 2022. (Mainly about the LRA preconditioner. See the paper's supplementary materials for detailed math derivations.)
  4. Xi-Lin Li. Stochastic Hessian fittings on Lie groups, arXiv:2402.11858, 2024. (Some theoretical works on the efficiency of PSGD. The Hessian fitting problem is shown to be strongly convex on set ${\rm GL}(n, \mathbb{R})/R_{\rm polar}$.)
  5. Omead Pooladzandi, Xi-Lin Li. Curvature-informed SGD via general purpose Lie-group preconditioners, arXiv:2402.04553, 2024. (Plenty of benchmark results and analyses for PSGD vs. other optimizers.)

License

CC BY 4.0

This work is licensed under a Creative Commons Attribution 4.0 International License.

© 2024 Evan Walters, Omead Pooladzandi, Xi-Lin Li

Download files

Download the file for your platform.

Source Distribution

quad_torch-0.3.0.tar.gz (11.2 kB)

Uploaded Source

Built Distribution


quad_torch-0.3.0-py3-none-any.whl (11.9 kB)

Uploaded Python 3

File details

Details for the file quad_torch-0.3.0.tar.gz.

File metadata

  • Download URL: quad_torch-0.3.0.tar.gz
  • Upload date:
  • Size: 11.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.11.9

File hashes

Hashes for quad_torch-0.3.0.tar.gz
  • SHA256: ff8e6b8bff8780da6833c9d2ebfa6538bb0705cca1a3af170f8ccf34f66f60ca
  • MD5: 2d1c1fa52ceccb89c4c5c5c1865ec7ad
  • BLAKE2b-256: 6847ac3ca0da3c9c597d70effdecb54d22acca187cbb9fcff0033d712f908810

File details

Details for the file quad_torch-0.3.0-py3-none-any.whl.

File metadata

  • Download URL: quad_torch-0.3.0-py3-none-any.whl
  • Upload date:
  • Size: 11.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.11.9

File hashes

Hashes for quad_torch-0.3.0-py3-none-any.whl
  • SHA256: 3bbc72000715ed3c2a3742358a5323ee9058500e8a75877f94f334c1faa65a67
  • MD5: 85b905570683f70889a6384ee64d4765
  • BLAKE2b-256: f591dbb13e51ef0a338cc862c5ea956de00710de8ba189620f15c26209453d88
