pytorch-optimizer

Project description

A collection of optimizer implementations in PyTorch, written with clean code and strict typing. Inspired by pytorch-optimizer.

Usage
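
Install the package from PyPI, e.g. pip install pytorch-optimizer==0.0.1 (the release described by the files listed below), and use the optimizers as drop-in replacements for torch.optim classes. Below is a minimal sketch, assuming the optimizer classes (such as AdamP) and the Lookahead wrapper are exported from the top-level pytorch_optimizer module; check the repository for the exact import paths and constructor arguments of this release.

import torch
from pytorch_optimizer import AdamP, Lookahead  # assumed top-level exports

model = torch.nn.Linear(10, 2)

# AdamP behaves like any torch.optim optimizer.
optimizer = AdamP(model.parameters(), lr=1e-3, weight_decay=1e-2)

# Optionally wrap the base optimizer with Lookahead ("k steps forward, 1 step back").
optimizer = Lookahead(optimizer, k=5, alpha=0.5)

for _ in range(10):
    optimizer.zero_grad()
    loss = model(torch.randn(8, 10)).sum()
    loss.backward()
    optimizer.step()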

Supported Optimizers

Optimizer | Description | Official Code | Paper
AdamP | Slowing Down the Slowdown for Momentum Optimizers on Scale-invariant Weights | github | https://arxiv.org/abs/2006.08217
Adaptive Gradient Clipping (AGC) | High-Performance Large-Scale Image Recognition Without Normalization | github | https://arxiv.org/abs/2102.06171
Chebyshev LR Schedules | Acceleration via Fractal Learning Rate Schedules | github | https://arxiv.org/abs/2103.01338v1
Gradient Centralization (GC) | A New Optimization Technique for Deep Neural Networks | github | https://arxiv.org/abs/2004.01461
Lookahead | k steps forward, 1 step back | github | https://arxiv.org/abs/1907.08610v2
RAdam | On the Variance of the Adaptive Learning Rate and Beyond | github | https://arxiv.org/abs/1908.03265
Ranger | a synergistic optimizer combining RAdam and Lookahead, and now GC, in one optimizer | github |
Ranger21 | integrating the latest deep learning components into a single optimizer | github |
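
Gradient Centralization and Adaptive Gradient Clipping in the table above are gradient transforms rather than full optimizers: they rewrite each parameter's gradient between loss.backward() and optimizer.step(). The sketch below is illustrative rather than this package's API, and the AGC version is simplified to per-tensor norms (the paper applies the rule unit-wise).

import torch

def centralize_gradient(grad: torch.Tensor) -> torch.Tensor:
    # Gradient Centralization: subtract the gradient mean over every
    # dimension except the first (output) one, for weights with rank > 1.
    if grad.dim() > 1:
        grad = grad - grad.mean(dim=tuple(range(1, grad.dim())), keepdim=True)
    return grad

def agc_(param: torch.Tensor, clip_factor: float = 1e-2, eps: float = 1e-3) -> None:
    # Adaptive Gradient Clipping (simplified, per-tensor): rescale the gradient
    # in place so its norm never exceeds clip_factor times the parameter norm.
    if param.grad is None:
        return
    p_norm = param.detach().norm().clamp(min=eps)
    g_norm = param.grad.detach().norm()
    max_norm = clip_factor * p_norm
    if g_norm > max_norm:
        param.grad.mul_(max_norm / g_norm)

Ranger-style optimizers combine transforms like these with a base update rule (RAdam) and Lookahead in a single optimizer.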

Citations

AdamP
@inproceedings{heo2021adamp,
  title={AdamP: Slowing Down the Slowdown for Momentum Optimizers on Scale-invariant Weights},
  author={Heo, Byeongho and Chun, Sanghyuk and Oh, Seong Joon and Han, Dongyoon and Yun, Sangdoo and Kim, Gyuwan and Uh, Youngjung and Ha, Jung-Woo},
  booktitle={International Conference on Learning Representations (ICLR)},
  year={2021}
}
Adaptive Gradient Clipping (AGC)
@article{brock2021high,
  author={Andrew Brock and Soham De and Samuel L. Smith and Karen Simonyan},
  title={High-Performance Large-Scale Image Recognition Without Normalization},
  journal={arXiv preprint arXiv:2102.06171},
  year={2021}
}
Chebyshev LR Schedules
@article{agarwal2021acceleration,
  title={Acceleration via Fractal Learning Rate Schedules},
  author={Agarwal, Naman and Goel, Surbhi and Zhang, Cyril},
  journal={arXiv preprint arXiv:2103.01338},
  year={2021}
}
Gradient Centralization (GC)
@inproceedings{yong2020gradient,
  title={Gradient centralization: A new optimization technique for deep neural networks},
  author={Yong, Hongwei and Huang, Jianqiang and Hua, Xiansheng and Zhang, Lei},
  booktitle={European Conference on Computer Vision},
  pages={635--652},
  year={2020},
  organization={Springer}
}
Lookahead
@article{zhang2019lookahead,
  title={Lookahead optimizer: k steps forward, 1 step back},
  author={Zhang, Michael R and Lucas, James and Hinton, Geoffrey and Ba, Jimmy},
  journal={arXiv preprint arXiv:1907.08610},
  year={2019}
}
RAdam
@inproceedings{liu2019radam,
  title={On the Variance of the Adaptive Learning Rate and Beyond},
  author={Liu, Liyuan and Jiang, Haoming and He, Pengcheng and Chen, Weizhu and Liu, Xiaodong and Gao, Jianfeng and Han, Jiawei},
  booktitle={Proceedings of the Eighth International Conference on Learning Representations (ICLR 2020)},
  month={April},
  year={2020}
}

Author

Hyeongchan Kim / @kozistr

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pytorch-optimizer-0.0.1.tar.gz (19.2 kB, source)

Built Distribution

pytorch_optimizer-0.0.1-py3-none-any.whl (21.7 kB, Python 3)

File details

Details for the file pytorch-optimizer-0.0.1.tar.gz.

File metadata

  • Download URL: pytorch-optimizer-0.0.1.tar.gz
  • Upload date:
  • Size: 19.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.2 importlib_metadata/4.8.1 pkginfo/1.7.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.8.11

File hashes

Hashes for pytorch-optimizer-0.0.1.tar.gz

  • SHA256: fa8c7b111cc8de044fbd5187532589cd7e966f177743b358af40eb618d6fcf02
  • MD5: ab763c8ac5845219de58bb4a3b7c8daa
  • BLAKE2b-256: 37cd1e2e260c2682bef84ec9c161b7f3aeded486ffdc95a0d6b7454e5ac6e793

See more details on using hashes here.
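
For example, a downloaded archive can be checked against the SHA256 digest above with a few lines of Python (assuming the file sits in the current directory):

import hashlib

EXPECTED = "fa8c7b111cc8de044fbd5187532589cd7e966f177743b358af40eb618d6fcf02"

with open("pytorch-optimizer-0.0.1.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == EXPECTED else "hash mismatch")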

File details

Details for the file pytorch_optimizer-0.0.1-py3-none-any.whl.

File metadata

  • Download URL: pytorch_optimizer-0.0.1-py3-none-any.whl
  • Upload date:
  • Size: 21.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.2 importlib_metadata/4.8.1 pkginfo/1.7.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.8.11

File hashes

Hashes for pytorch_optimizer-0.0.1-py3-none-any.whl

  • SHA256: f4708f1bc64c6bed02282badfe81170f6c8dd26663aa8efb6f6002f58589caac
  • MD5: e20beea609dc94d5993e0ceb5dbcc142
  • BLAKE2b-256: 633b417ebc52ab2c98f6724f1ae56b4378944b4a10dfeceaf4a6b22273a2cfc9

See more details on using hashes here.
