
A general-purpose PyTorch optimizer

Project description

MADGRAD Optimization Method

A Momentumized, Adaptive, Dual Averaged Gradient Method for Stochastic Optimization

pip install madgrad

Try it out! A best-of-both-worlds optimizer with the generalization performance of SGD and convergence at least as fast as Adam, often faster. A drop-in torch.optim implementation, madgrad.MADGRAD, is provided, as well as a FairSeq-wrapped instance. For FairSeq, just import madgrad anywhere in your project files and use the --optimizer madgrad command-line option, together with --weight-decay, --momentum, and optionally --madgrad_eps.
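
A minimal sketch of the drop-in usage is shown below. The training loop, model, and data are purely illustrative, and the lr, momentum, weight_decay, and eps values are placeholders corresponding to the options mentioned above, not tuned recommendations:

    import torch
    import madgrad

    # Any torch.nn.Module works; a tiny linear model keeps the sketch short.
    model = torch.nn.Linear(10, 2)
    loss_fn = torch.nn.CrossEntropyLoss()

    # MADGRAD is used like any other torch.optim optimizer. The hyperparameter
    # values here are illustrative placeholders -- sweep the learning rate for
    # your own task, as suggested below.
    optimizer = madgrad.MADGRAD(
        model.parameters(), lr=1e-2, momentum=0.9, weight_decay=0.0, eps=1e-6
    )

    for _ in range(100):
        inputs = torch.randn(32, 10)          # dummy batch of features
        targets = torch.randint(0, 2, (32,))  # dummy class labels
        optimizer.zero_grad()
        loss = loss_fn(model(inputs), targets)
        loss.backward()
        optimizer.step()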

The madgrad.py file containing the optimizer can be dropped directly into any PyTorch project if you don't want to install via pip. If you are using fairseq, you need the accompanying fairseq_madgrad.py file as well.

Documentation is available at https://madgrad.readthedocs.io/en/latest/.

Things to note:

  • You may need to use a lower weight decay than you are accustomed to, often 0.
  • You should do a full learning-rate sweep, as the optimal learning rate will differ from that of SGD or Adam. The best LR values we found were 2.5e-4 for a 152-layer PreActResNet on CIFAR10, 0.001 for ResNet-50 on ImageNet, 0.025 for IWSLT14 using transformer_iwslt_de_en, and 0.005 for RoBERTa training on BookWiki using BERT_BASE. On NLP models, gradient clipping also helped (see the sketch after this list).
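
As noted in the last item, gradient clipping can be combined with MADGRAD in the standard PyTorch way. A minimal, self-contained sketch follows; the model, data, learning rate, and the max_norm value of 1.0 are all illustrative assumptions, not tuned settings:

    import torch
    import madgrad

    model = torch.nn.Linear(10, 2)
    loss_fn = torch.nn.CrossEntropyLoss()
    optimizer = madgrad.MADGRAD(model.parameters(), lr=1e-3)  # placeholder lr

    inputs = torch.randn(32, 10)          # dummy batch of features
    targets = torch.randint(0, 2, (32,))  # dummy class labels

    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()
    # Clip the global gradient norm before the optimizer step;
    # max_norm=1.0 is an arbitrary illustration, not a recommendation.
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
    optimizer.step()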

Tech Report

Adaptivity without Compromise: A Momentumized, Adaptive, Dual Averaged Gradient Method for Stochastic Optimization

We introduce MADGRAD, a novel optimization method in the family of AdaGrad adaptive gradient methods. MADGRAD shows excellent performance on deep learning optimization problems from multiple fields, including classification and image-to-image tasks in vision, and recurrent and bidirectionally-masked models in natural language processing. For each of these tasks, MADGRAD matches or outperforms both SGD and ADAM in test set performance, even on problems for which adaptive methods normally perform poorly.

@misc{defazio2021adaptivity,
      title={Adaptivity without Compromise: A Momentumized, Adaptive, Dual Averaged Gradient Method for Stochastic Optimization}, 
      author={Aaron Defazio and Samy Jelassi},
      year={2021},
      eprint={2101.11075},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}

Results

[Figures: results on vision and NLP benchmarks]

License

MADGRAD is licensed under the MIT License.



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

madgrad-1.1.tar.gz (5.2 kB)

Uploaded Source

Built Distribution

madgrad-1.1-py3-none-any.whl (7.4 kB)

Uploaded Python 3

File details

Details for the file madgrad-1.1.tar.gz.

File metadata

  • Download URL: madgrad-1.1.tar.gz
  • Upload date:
  • Size: 5.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.7.0 requests/2.25.1 setuptools/54.2.0 requests-toolbelt/0.9.1 tqdm/4.59.0 CPython/3.7.9

File hashes

Hashes for madgrad-1.1.tar.gz
  • SHA256: f753c16f0421e53feaffd4d93cba23a431b3a5c1e687e1e28e71e2bdc6c3e6f4
  • MD5: ec61627badc19433b0439ded891de1fc
  • BLAKE2b-256: 1bc9cec5a0f4e9554cb2de65b895d3068ac4f03beee0636ff57ba00fa1f546bc


File details

Details for the file madgrad-1.1-py3-none-any.whl.

File metadata

  • Download URL: madgrad-1.1-py3-none-any.whl
  • Upload date:
  • Size: 7.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.7.0 requests/2.25.1 setuptools/54.2.0 requests-toolbelt/0.9.1 tqdm/4.59.0 CPython/3.7.9

File hashes

Hashes for madgrad-1.1-py3-none-any.whl
  • SHA256: 9b9c433c4ef7aaf7d5cc75506ac0af65c47ccccc153b1a1aa4f72dad00cbd0b4
  • MD5: 93527e2e031232e117ad87bb389a516a
  • BLAKE2b-256: 65f04584f18202a2fb8903d456bf907b80e7cb54ad8fcba68604084ff41b7cf8

