An Adam-like optimizer for neural networks with adaptive estimation of the learning rate
Prodigy: An Expeditiously Adaptive Parameter-Free Learner
This is the official repository for our paper introducing the Prodigy optimizer. Currently, the code is available only in PyTorch.
Prodigy: An Expeditiously Adaptive Parameter-Free Learner
K. Mishchenko, A. Defazio
Paper: https://arxiv.org/pdf/2306.06101.pdf
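A minimal usage sketch is given below. It assumes the package is installed from PyPI (pip install prodigyopt) and that its Prodigy class follows the standard torch.optim interface; the model, data, and weight decay value are placeholders. Since Prodigy estimates the learning rate on the fly, lr is left at 1.0.

import torch
import torch.nn as nn
from prodigyopt import Prodigy

# Placeholder model and loss; substitute your own network and data loader.
model = nn.Linear(10, 1)
loss_fn = nn.MSELoss()

# lr stays at 1.0 because Prodigy adapts the step size internally;
# weight_decay is optional and problem-dependent (0 by default).
optimizer = Prodigy(model.parameters(), lr=1.0, weight_decay=0.0)

for step in range(100):
    x = torch.randn(32, 10)  # placeholder batch of inputs
    y = torch.randn(32, 1)   # placeholder targets
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

The key point is that no manual learning-rate tuning is required: the optimizer maintains its own estimate of the step size, which is why the learning rate is left at its default.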
How to cite
If you find our work useful, please consider citing our paper.
@article{mishchenko2023prodigy,
    title={Prodigy: An Expeditiously Adaptive Parameter-Free Learner},
    author={Mishchenko, Konstantin and Defazio, Aaron},
    journal={arXiv preprint arXiv:2306.06101},
    year={2023},
    url={https://arxiv.org/pdf/2306.06101.pdf}
}
Download files
Source Distribution: prodigyopt-1.0.tar.gz (5.3 kB)