
Learning Rate Free Learning for Adam, SGD and AdaGrad

Project description

D-Adaptation

Learning rate free learning for SGD, AdaGrad and Adam!

Details

The provided PyTorch optimizer classes can be dropped into your project and used as normal; a minimal usage sketch follows the list below.

  • Set the LR parameter to 1.0. This parameter is not ignored; rather, setting it larger or smaller will directly scale the D-adapted learning rate up or down.
  • If you encounter divergence early on, try changing rho to match a reasonable warmup schedule for your problem.
  • Use the same learning rate scheduler you would normally use on the problem.
  • The Adam variant supports AdamW-style weight decay; just set decouple=True. It is not turned on by default, so if you are replacing an existing Adam implementation, make sure to use decoupled weight decay if your original setup did.
  • Use the log_every setting to see the learning rate being used (d*lr) and the current D bound.
  • Only the AdaGrad version supports sparse gradients.
  • The IP variants implement a tighter D bound.
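
Below is a minimal usage sketch, not part of the original package documentation, showing how one of these optimizers might be dropped into a standard PyTorch training loop. It assumes the package exposes a DAdaptAdam class accepting the lr, decouple, log_every, and weight_decay arguments described above; check the constructor signature in your installed version.

```python
# Minimal sketch: DAdaptAdam in a standard PyTorch training loop.
# Class name and keyword arguments follow the bullets above; verify against
# the installed dadaptation version before relying on them.
import torch
import torch.nn as nn
import dadaptation

model = nn.Linear(10, 2)
criterion = nn.CrossEntropyLoss()

# lr stays at 1.0; D-Adaptation estimates the actual step size.
# decouple=True enables AdamW-style (decoupled) weight decay.
# log_every=10 periodically reports the adapted learning rate (d*lr) and the D bound.
optimizer = dadaptation.DAdaptAdam(
    model.parameters(), lr=1.0, weight_decay=1e-4, decouple=True, log_every=10
)

# Attach the same learning rate scheduler you would normally use on the problem.
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=100)

for epoch in range(100):
    inputs = torch.randn(32, 10)          # placeholder batch
    targets = torch.randint(0, 2, (32,))  # placeholder labels
    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()
    optimizer.step()
    scheduler.step()
```

The SGD and AdaGrad variants follow the same pattern, with lr likewise kept at 1.0.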

Experimental results

[Plots of experimental results on vision tasks]

License

See the License file.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

dadaptation-1.2.tar.gz (7.6 kB)


File details

Details for the file dadaptation-1.2.tar.gz.

File metadata

  • Download URL: dadaptation-1.2.tar.gz
  • Upload date:
  • Size: 7.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.7.0 requests/2.25.1 setuptools/52.0.0.post20210125 requests-toolbelt/0.9.1 tqdm/4.59.0 CPython/3.7.3

File hashes

Hashes for dadaptation-1.2.tar.gz

  • SHA256: 938eccda83ee2c58f8db8647dfbef48ef597773a0a1301e6a21ba4b5b4855503
  • MD5: 7e01ae397446a8e201767c436d6eeb68
  • BLAKE2b-256: d7dff221401d085bcdc4161933fc4f9153718afef197c51cdd0831fedc7ccfdd

See the PyPI documentation for more details on using file hashes.
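
For illustration, here is a short Python sketch that checks a downloaded copy of the archive against the SHA256 digest listed above. The local file path is an assumption; adjust it to wherever the archive was saved.

```python
# Verify dadaptation-1.2.tar.gz against the SHA256 digest listed above.
import hashlib

EXPECTED_SHA256 = "938eccda83ee2c58f8db8647dfbef48ef597773a0a1301e6a21ba4b5b4855503"

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Return the hex SHA256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

if __name__ == "__main__":
    # Assumes the archive sits in the current working directory.
    actual = sha256_of("dadaptation-1.2.tar.gz")
    assert actual == EXPECTED_SHA256, f"hash mismatch: {actual}"
    print("SHA256 verified")
```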
