
Project description

Implementation of the LAMB optimizer for large batch, large learning rate training.

The paper doesn't specify clamp values for ϕ, so I use 10.
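
As a rough sketch of where that clamp enters the per-layer update (the tensors w and u and the standalone step are illustrative, not the package's internals):

import torch

# Illustrative per-layer tensors: w is a weight matrix, u is its Adam-style
# update (bias-corrected moments plus weight decay), both assumed precomputed.
w = torch.randn(256, 128)
u = torch.randn(256, 128)

weight_norm = torch.norm(w).clamp(0, 10)   # ϕ(||w||) clamped to [0, 10]
update_norm = torch.norm(u)
trust_ratio = weight_norm / update_norm if update_norm > 0 else 1.0

lr = 0.02
w = w - lr * trust_ratio * u               # layer-wise scaled step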

Bonus: TensorboardX logging (example below).
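
A minimal sketch of that kind of logging with tensorboardX (the tags and values are placeholders; the package's own logging helper may differ):

import numpy as np
from tensorboardX import SummaryWriter

writer = SummaryWriter()  # writes to ./runs by default, which is what the tensorboard command below points at

step = 0
trust_ratios = np.array([0.9, 1.1, 0.7])       # placeholder per-layer values
writer.add_histogram('lamb/trust_ratio', trust_ratios, step)
writer.add_scalar('loss/train', 0.42, step)    # placeholder scalar
writer.close()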

Try the sample

git clone
cd pytorch-lamb
pip install -e .
tensorboard --logdir=runs
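
After installing, a minimal training step might look like the following (the Lamb class name and constructor arguments are assumptions inferred from the package name; check the source for the exact API):

import torch
import torch.nn.functional as F
from pytorch_lamb import Lamb  # assumed import path and class name

model = torch.nn.Linear(784, 10)
optimizer = Lamb(model.parameters(), lr=0.02, weight_decay=0.01, betas=(0.9, 0.999))

x, y = torch.randn(512, 784), torch.randint(0, 10, (512,))
loss = F.cross_entropy(model(x), y)
loss.backward()
optimizer.step()
optimizer.zero_grad()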

Sample results

At --lr=.02, the Adam optimizer is unable to train.

Red: python --batch-size=512 --lr=.02 --wd=.01 --log-interval=30 --optimizer=adam

Blue: python --batch-size=512 --lr=.02 --wd=.01 --log-interval=30 --optimizer=lamb

