
An Adam-like optimizer for neural networks with adaptive estimation of the learning rate

Project description

Prodigy: An Expeditiously Adaptive Parameter-Free Learner

This is the official implementation of the Prodigy optimizer proposed in our paper. Currently, the code is available only in PyTorch.
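As a minimal usage sketch (assuming the package is installed via pip install prodigyopt and exposes a Prodigy class with an Adam-like constructor, as its name suggests), the optimizer can be dropped into a standard PyTorch training loop in place of Adam:

import torch
from prodigyopt import Prodigy  # assumed class name exposed by the prodigyopt package

# Toy model and data, for illustration only
net = torch.nn.Linear(10, 1)
x, y = torch.randn(32, 10), torch.randn(32, 1)

# lr=1.0 leaves the step-size estimation to Prodigy itself;
# weight_decay is optional and problem-dependent (assumed parameter)
opt = Prodigy(net.parameters(), lr=1.0, weight_decay=0.0)

for step in range(100):
    opt.zero_grad()
    loss = torch.nn.functional.mse_loss(net(x), y)
    loss.backward()
    opt.step()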

Prodigy: An Expeditiously Adaptive Parameter-Free Learner
K. Mishchenko, A. Defazio
Paper: https://arxiv.org/pdf/2306.06101.pdf

How to cite

If you find our work useful, please consider citing our paper.

@article{mishchenko2023prodigy,
    title={Prodigy: An Expeditiously Adaptive Parameter-Free Learner},
    author={Mishchenko, Konstantin and Defazio, Aaron},
    journal={arXiv preprint arXiv:2306.06101},
    year={2023},
    url={https://arxiv.org/pdf/2306.06101.pdf}
}

Project details


Release history

This version: 1.0

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

prodigyopt-1.0.tar.gz (5.3 kB)

Built Distribution

prodigyopt-1.0-py3-none-any.whl (5.5 kB)
