An Adam-like optimizer for neural networks with adaptive estimation of the learning rate
Project description
Prodigy: An Expeditiously Adaptive Parameter-Free Learner
This is the official repository for our paper introducing the Prodigy optimizer. Currently, the code is only available in PyTorch.
K. Mishchenko, A. Defazio. Prodigy: An Expeditiously Adaptive Parameter-Free Learner. Paper: https://arxiv.org/pdf/2306.06101.pdf
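At its core, Prodigy runs Adam-style exponential moving averages while maintaining a running estimate d of the distance to the solution; that estimate multiplies the base learning rate, which is why the learning rate can typically be left at its default. The snippet below is a minimal, pure-Python 1-D sketch of this mechanism paraphrased from the Adam version of the algorithm in the paper; `prodigy_adam_1d` and its defaults are illustrative, not the package API, and details such as bias correction and safeguard factors are omitted.

```python
import math

def prodigy_adam_1d(grad_fn, x0, steps=300, lr=1.0, d0=1e-6,
                    beta1=0.9, beta2=0.999, eps=1e-8):
    # Scalar sketch of the Adam-flavoured Prodigy update. d estimates the
    # distance to the solution, so the effective step size is lr * d and
    # lr itself can stay at 1.0.
    x, d = x0, d0
    m = v = r = s = 0.0
    sqrt_b2 = math.sqrt(beta2)
    for _ in range(steps):
        g = grad_fn(x)
        # Adam-like first/second moment EMAs, with gradients scaled by d.
        m = beta1 * m + (1 - beta1) * d * g
        v = beta2 * v + (1 - beta2) * (d * g) ** 2
        # Running numerator and denominator of the distance estimate.
        r = sqrt_b2 * r + (1 - sqrt_b2) * lr * d * d * g * (x0 - x)
        s = sqrt_b2 * s + (1 - sqrt_b2) * lr * d * d * g
        if s != 0.0:
            d = max(d, r / abs(s))  # the estimate never decreases
        if v > 0.0:
            x = x - lr * d * m / (math.sqrt(v) + d * eps)
    return x, d

# Minimize f(x) = (x - 3)^2 from x0 = 0 with a deliberately tiny d0:
# d grows on its own toward the scale of the true distance to the optimum.
x_final, d_final = prodigy_adam_1d(lambda x: 2.0 * (x - 3.0), x0=0.0)
```

Note how `d0` starts many orders of magnitude too small, yet the `r / |s|` ratio grows it automatically; this is what makes the method parameter-free in practice.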
How to cite
If you find our work useful, please consider citing our paper.
@article{mishchenko2023prodigy,
  title={Prodigy: An Expeditiously Adaptive Parameter-Free Learner},
  author={Mishchenko, Konstantin and Defazio, Aaron},
  journal={arXiv preprint arXiv:2306.06101},
  year={2023},
  url={https://arxiv.org/pdf/2306.06101.pdf}
}
Project details
Release history
Download files
Download the file for your platform.
Source Distribution
- prodigyopt-1.0.tar.gz (5.3 kB)

Built Distribution
- prodigyopt-1.0-py3-none-any.whl (5.5 kB)
File details
Details for the file prodigyopt-1.0.tar.gz.
File metadata
- Download URL: prodigyopt-1.0.tar.gz
- Upload date:
- Size: 5.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.9.13
File hashes
Algorithm | Hash digest
---|---
SHA256 | cdbbd99e836fa6eb90afa49f5eb1a7760d634a15976e77e3e8114349abe910ac
MD5 | 229d11d840f7b1ade99a1163b25df1dd
BLAKE2b-256 | dc0ea7660562219fef53c0fd2ddc0d0b904e09d4efb4399796d4dce8d1f2e3d5
File details
Details for the file prodigyopt-1.0-py3-none-any.whl.
File metadata
- Download URL: prodigyopt-1.0-py3-none-any.whl
- Upload date:
- Size: 5.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.9.13
File hashes
Algorithm | Hash digest
---|---
SHA256 | 3ca9c44754949e344802fd14868007a05bf16fe4e025c39814e41b1491c783c9
MD5 | fadb9ad20c8c8ed2d0a21726492a77a5
BLAKE2b-256 | 46436c1b6dfaf9a864c0f9c8bad0d20ac6aa2135775c0093fbb4f8bb947edc70