
Adam-atan2 for PyTorch

Project description

Adam-atan2 - PyTorch

Implementation of the proposed Adam-atan2 optimizer in PyTorch

A multi-million dollar paper out of Google DeepMind proposes a small change to the Adam update rule (using atan2) that removes the epsilon hyperparameter altogether, for numerical stability and scale invariance
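The core idea can be sketched per coordinate in plain Python. Classic Adam divides the first moment by the square root of the second moment, guarded by an epsilon; the atan2 variant replaces that division with `atan2`, which is well defined even when both arguments are zero and whose output is bounded. The constants `a` and `b` below are assumptions following this package's defaults, not values taken verbatim from the paper:

```python
import math

def adam_update_classic(m_hat, v_hat, eps = 1e-8):
    # classic Adam update direction: eps guards against division by zero
    return m_hat / (math.sqrt(v_hat) + eps)

def adam_update_atan2(m_hat, v_hat, a = 1.27, b = 1.0):
    # atan2 variant: no eps needed, since atan2(0, 0) == 0 by convention,
    # and the result is bounded in magnitude by a * pi / 2.
    # a and b are hypothetical defaults for illustration
    return a * math.atan2(m_hat, b * math.sqrt(v_hat))
```

Note that when the second moment dominates the first (the common case away from zero gradients), `atan2(m, sqrt(v)) ≈ m / sqrt(v)`, so the two rules agree up to the scale factor; the atan2 form only departs meaningfully near the origin, exactly where the epsilon used to matter.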

Install

$ pip install adam-atan2-pytorch

Usage

import torch
from torch import nn

from adam_atan2_pytorch import AdamAtan2

# toy model

model = nn.Linear(10, 1)

# instantiate the optimizer with the model's parameters

opt = AdamAtan2(model.parameters(), lr = 1e-4)

# forward and backwards

for _ in range(100):
  loss = model(torch.randn(10)).sum()
  loss.backward()

  # optimizer step

  opt.step()
  opt.zero_grad()

Citations

@inproceedings{Everett2024ScalingEA,
    title   = {Scaling Exponents Across Parameterizations and Optimizers},
    author  = {Katie Everett and Lechao Xiao and Mitchell Wortsman and Alex Alemi and Roman Novak and Peter J. Liu and Izzeddin Gur and Jascha Narain Sohl-Dickstein and Leslie Pack Kaelbling and Jaehoon Lee and Jeffrey Pennington},
    year    = {2024},
    url     = {https://api.semanticscholar.org/CorpusID:271051056}
}
@inproceedings{Kumar2023MaintainingPI,
    title   = {Maintaining Plasticity in Continual Learning via Regenerative Regularization},
    author  = {Saurabh Kumar and Henrik Marklund and Benjamin Van Roy},
    year    = {2023},
    url     = {https://api.semanticscholar.org/CorpusID:261076021}
}

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

adam_atan2_pytorch-0.0.12.tar.gz (418.3 kB)

Uploaded Source

Built Distribution

adam_atan2_pytorch-0.0.12-py3-none-any.whl (6.6 kB)

Uploaded Python 3

File details

Details for the file adam_atan2_pytorch-0.0.12.tar.gz.

File metadata

  • Download URL: adam_atan2_pytorch-0.0.12.tar.gz
  • Upload date:
  • Size: 418.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.19

File hashes

Hashes for adam_atan2_pytorch-0.0.12.tar.gz
Algorithm Hash digest
SHA256 783219b00580a335c54c962fea48a0ae030a2d76de3b3a7246b38fc9571a0a51
MD5 d44cb1b17f4746d37adc394bf4e4c5e3
BLAKE2b-256 42fa63648736646a26ce5e2d04f4c5654ace518ed45d83823ee675498e9a1c33

See more details on using hashes here.

File details

Details for the file adam_atan2_pytorch-0.0.12-py3-none-any.whl.

File hashes

Hashes for adam_atan2_pytorch-0.0.12-py3-none-any.whl
Algorithm Hash digest
SHA256 4ce49fc2739f4ed40ca265966b068dcadd0c25063b640c01ac5ea4813bbfdeff
MD5 582b3bf2b960463ea6d55ecbcc0ad1ca
BLAKE2b-256 fbd990b13800bdf1fca7495f9319cf4fa4cad021aad1a83e6831c1eba20c7570

See more details on using hashes here.
