
RAdam implemented in TensorFlow 1.x

Project description

RAdam-Tensorflow

On the Variance of the Adaptive Learning Rate and Beyond

Paper | Official PyTorch code

Usage

from RAdam import RAdamOptimizer

# Build the training op in TF 1.x graph mode (defaults shown explicitly)
train_op = RAdamOptimizer(learning_rate=0.001, beta1=0.9, beta2=0.999, weight_decay=0.0).minimize(loss)

Algorithm
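The algorithm figure from the repository is not reproduced here. As a rough illustration only, the rectified update rule from the paper can be sketched in plain NumPy (function and variable names are my own, not the package's API):

```python
import numpy as np

def radam_step(param, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One RAdam update (Liu et al., 2019); t is the 1-based step count."""
    m = beta1 * m + (1.0 - beta1) * grad        # first moment (momentum)
    v = beta2 * v + (1.0 - beta2) * grad ** 2   # second moment
    m_hat = m / (1.0 - beta1 ** t)              # bias-corrected momentum
    rho_inf = 2.0 / (1.0 - beta2) - 1.0         # max length of the approximated SMA
    rho_t = rho_inf - 2.0 * t * beta2 ** t / (1.0 - beta2 ** t)
    if rho_t > 4.0:
        # variance of the adaptive rate is tractable: apply the rectified step
        r_t = np.sqrt(((rho_t - 4.0) * (rho_t - 2.0) * rho_inf) /
                      ((rho_inf - 4.0) * (rho_inf - 2.0) * rho_t))
        v_hat = np.sqrt(v / (1.0 - beta2 ** t))
        param = param - lr * r_t * m_hat / (v_hat + eps)
    else:
        # early steps: fall back to SGD with momentum (no adaptive rate)
        param = param - lr * m_hat
    return param, m, v

# minimise f(x) = x^2 for a few steps
x, m, v = np.array(5.0), 0.0, 0.0
for t in range(1, 101):
    x, m, v = radam_step(x, 2.0 * x, m, v, t)
```

With beta2=0.999 the first few steps take the momentum-SGD branch (rho_t <= 4), which is exactly the warmup behaviour RAdam automates.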

Result

(result figure from the repository, not reproduced here)

Author

Junho Kim

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

tf_1.x_rectified_adam-0.0.2-py3-none-any.whl (4.4 kB)

Uploaded Python 3

File details

Details for the file tf_1.x_rectified_adam-0.0.2-py3-none-any.whl.

File metadata

  • Download URL: tf_1.x_rectified_adam-0.0.2-py3-none-any.whl
  • Upload date:
  • Size: 4.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.6.1 requests/2.24.0 setuptools/50.3.2 requests-toolbelt/0.9.1 tqdm/4.51.0 CPython/3.6.12

File hashes

Hashes for tf_1.x_rectified_adam-0.0.2-py3-none-any.whl
Algorithm Hash digest
SHA256 0541a62722ee99ae62044fcfcd685602c8e2d2dee4419dbcf20b85404cf8ed9d
MD5 061d2a6808fded0c60b09b9bffb7f07b
BLAKE2b-256 036ee0485e1a6c93bee9105c7dec62ee484dfdd7b1b5e879ca04a2362281d3e9

See more details on using hashes here.
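To check a downloaded wheel against the SHA256 digest in the table above, a minimal sketch using Python's standard hashlib (the file path in the commented line is hypothetical):

```python
import hashlib

def sha256_of(path, chunk_size=8192):
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# expected digest, copied from the hash table above
EXPECTED = "0541a62722ee99ae62044fcfcd685602c8e2d2dee4419dbcf20b85404cf8ed9d"

# after downloading the wheel, compare (path is a placeholder):
# assert sha256_of("tf_1.x_rectified_adam-0.0.2-py3-none-any.whl") == EXPECTED
```

pip can also enforce digests automatically via hash-checking mode (the --hash option in a requirements file).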
