
RAdam implemented in Keras & TensorFlow

Project description

Keras RAdam



Unofficial implementation of RAdam in Keras and TensorFlow.
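RAdam (Rectified Adam, from "On the Variance of the Adaptive Learning Rate and Beyond", Liu et al., 2019) rectifies the variance of Adam's adaptive learning rate during the early steps of training. A minimal NumPy sketch of one update step, following the paper rather than this package's internals:

```python
import numpy as np

def radam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One RAdam update step (sketch of the algorithm in Liu et al., 2019)."""
    m = beta1 * m + (1 - beta1) * grad          # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2     # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                # bias-corrected first moment
    rho_inf = 2.0 / (1 - beta2) - 1             # max length of the approximated SMA
    rho_t = rho_inf - 2 * t * beta2 ** t / (1 - beta2 ** t)
    if rho_t > 4:
        # Variance of the adaptive lr is tractable: apply the rectification term.
        r = np.sqrt((rho_t - 4) * (rho_t - 2) * rho_inf /
                    ((rho_inf - 4) * (rho_inf - 2) * rho_t))
        v_hat = np.sqrt(v / (1 - beta2 ** t))
        theta = theta - lr * r * m_hat / (v_hat + eps)
    else:
        # Early steps: fall back to SGD with momentum (no adaptive lr).
        theta = theta - lr * m_hat
    return theta, m, v
```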


pip install keras-rectified-adam



import keras
import numpy as np
from keras_radam import RAdam

# Build toy model with RAdam optimizer
model = keras.models.Sequential()
model.add(keras.layers.Dense(input_shape=(17,), units=3))
model.compile(RAdam(), loss='mse')

# Generate toy data
x = np.random.standard_normal((4096 * 30, 17))
w = np.random.standard_normal((17, 3))
y = np.dot(x, w)

# Fit the model
model.fit(x, y, epochs=5)

TensorFlow without Keras

from keras_radam.training import RAdamOptimizer


Use Warmup

from keras_radam import RAdam

RAdam(total_steps=10000, warmup_proportion=0.1, min_lr=1e-5)
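With warmup enabled, the learning rate presumably ramps up linearly over the first warmup_proportion * total_steps steps and then decays toward min_lr over the rest. A rough sketch of such a schedule (the linear shape, default lr, and hypothetical helper name are assumptions, not the package's exact code):

```python
def warmup_lr(step, total_steps=10000, warmup_proportion=0.1,
              lr=1e-3, min_lr=1e-5):
    """Sketch of a linear warmup-then-decay schedule (assumed behavior)."""
    warmup_steps = int(total_steps * warmup_proportion)
    if step < warmup_steps:
        # Linear ramp from 0 up to the base learning rate.
        return lr * step / warmup_steps
    # Linear decay from lr down to min_lr over the remaining steps.
    decay = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return lr - (lr - min_lr) * decay
```

With the arguments above, total_steps should roughly match epochs * steps_per_epoch for the schedule to finish its decay by the end of training.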

Q & A

About Correctness

The optimizer produces similar losses and weights to the official optimizer after 500 steps.

Use tf.keras or tf-2.0

Set the environment variable TF_KERAS=1 to use tensorflow.python.keras.
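The flag has to be set before keras_radam is imported, since the package presumably reads it at import time to decide which Keras to wrap. One way to do that from Python:

```python
import os

# Must be set before keras_radam is imported; otherwise the package
# falls back to standalone Keras.
os.environ['TF_KERAS'] = '1'

# from keras_radam import RAdam  # import after setting the flag
```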

Use theano Backend

Set the environment variable KERAS_BACKEND=theano to enable the Theano backend.
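Keras reads KERAS_BACKEND at import time, so the variable must be in the environment before the training script starts. A minimal shell sketch (train.py is a hypothetical script name):

```shell
# Select the Theano backend for this shell session; Keras picks it up on import.
export KERAS_BACKEND=theano
# python train.py
```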

Project details

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Files for keras-radam, version 0.15.0:

keras-radam-0.15.0.tar.gz (11.7 kB) — source distribution
