The AdaMod optimization algorithm, built on PyTorch.

# AdaMod

An optimizer that exerts adaptive momental upper bounds on individual learning rates, preventing them from becoming undesirably larger than what the historical statistics suggest and thereby avoiding non-convergence, which leads to better performance. Strong empirical results on many deep learning applications demonstrate the effectiveness of the method, especially on complex networks such as DenseNet and Transformer.

<p align="center"><img src="img/Loss.bmp" width="100%"/></p>
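The bounding rule described above can be sketched in plain NumPy (an illustrative re-implementation, not the package's code; the function and state names here are ours): each parameter gets an Adam-style per-element step size, an exponential moving average of those step sizes is tracked with `beta3`, and the actual step is capped element-wise by that average.

```python
import numpy as np

def adamod_step(param, grad, state, lr=1e-3, betas=(0.9, 0.999),
                beta3=0.999, eps=1e-8):
    """One AdaMod update on a NumPy array (illustrative sketch)."""
    state["t"] += 1
    t = state["t"]
    b1, b2 = betas
    state["m"] = b1 * state["m"] + (1 - b1) * grad        # first moment (momentum)
    state["v"] = b2 * state["v"] + (1 - b2) * grad ** 2   # second moment
    m_hat = state["m"] / (1 - b1 ** t)                    # bias corrections, as in Adam
    v_hat = state["v"] / (1 - b2 ** t)
    step_size = lr / (np.sqrt(v_hat) + eps)               # per-element Adam learning rate
    # EMA of learning rates, then the "momental" upper bound:
    state["s"] = beta3 * state["s"] + (1 - beta3) * step_size
    bounded = np.minimum(step_size, state["s"])
    return param - bounded * m_hat

state = {"t": 0, "m": np.zeros(3), "v": np.zeros(3), "s": np.zeros(3)}
p = np.array([1.0, -2.0, 0.5])
p = adamod_step(p, np.array([0.1, -0.2, 0.3]), state)
```

The `np.minimum` on the last step of the update is the whole trick: a step size can never exceed its own historical average, so isolated spikes are clipped away.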

## Installation

AdaMod requires Python 3.6.0 or later.

### Installing via pip

The preferred way to install AdaMod is via pip inside a virtual environment. Just run

```bash
pip install adamod
```

in your Python environment and you are ready to go!

### Using source code

As AdaMod is a single Python class of only about 100 lines, an alternative is to download [adamod.py](./adamod/adamod.py) directly and copy it into your project.

## Usage

You can use AdaMod just like any other PyTorch optimizer.

```python
optimizer = adamod.AdaMod(model.parameters(), lr=1e-3, beta3=0.999)
```

As described in the paper, AdaMod smooths out unexpectedly large learning rates throughout the training process. The `beta3` parameter is the smoothing coefficient for the actual learning rate and controls the averaging range. In common cases, a `beta3` in {0.999, 0.9999} achieves relatively good and stable results. See the paper for more details.
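To see what `beta3` controls, here is a toy illustration (plain NumPy, with a synthetic sequence of per-step learning rates; the `ema_bound` helper is ours, not part of the package): a larger `beta3` makes the moving average, and hence the upper bound, react more slowly to a sudden spike in the learning rate.

```python
import numpy as np

def ema_bound(rates, beta3):
    """Apply AdaMod's EMA ceiling to a sequence of per-step learning rates."""
    s, out = 0.0, []
    for r in rates:
        s = beta3 * s + (1 - beta3) * r  # smoothed learning rate
        out.append(min(r, s))            # momental upper bound applied each step
    return np.array(out)

rates = np.full(2000, 1e-3)
rates[1000] = 1.0  # one unexpectedly large learning rate
fast = ema_bound(rates, beta3=0.999)
slow = ema_bound(rates, beta3=0.9999)
```

Both settings clip the spike at step 1000 far below 1.0, and the larger `beta3` clips it more aggressively because its average is dominated by the long history of small rates.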

## Demos

For the full list of demos, please refer to [this page](./demos).

## Contributors

[@luoruixuan](https://github.com/luoruixuan)
