An implementation of the Adan optimization algorithm for optax.
Project description
optax-adan
An implementation of the Adan optimizer for optax, based on the paper Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models.
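For orientation, Adan tracks three exponential moving averages: the gradient, the difference between consecutive gradients, and a squared Nesterov-corrected gradient. A condensed sketch of the update rule from the paper, with g_k the gradient at step k, β1, β2, β3 the averaging coefficients, η the learning rate, ε the numerical-stability constant, and λ the weight decay (see the paper for the authoritative form):

m_k = (1 - β1) m_{k-1} + β1 g_k
v_k = (1 - β2) v_{k-1} + β2 (g_k - g_{k-1})
n_k = (1 - β3) n_{k-1} + β3 [g_k + (1 - β2)(g_k - g_{k-1})]^2
θ_{k+1} = (1 + λη)^{-1} [θ_k - η / (√n_k + ε) · (m_k + (1 - β2) v_k)]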
A Colab notebook with a usage example can be found here.
How to use:
Install the package:
python3 -m pip install optax-adan
Import the optimizer:
from optax_adan import adan
Use it as you would use any other optimizer from optax:
# init: build the optimizer and initialize its state from the initial parameters
optimizer = adan(learning_rate=0.01)
optimizer_state = optimizer.init(initial_params)
# step: compute gradients, transform them into updates, apply the updates
grad = grad_func(params)
updates, optimizer_state = optimizer.update(grad, optimizer_state, params)
params = optax.apply_updates(params, updates)
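Putting the steps together, here is a minimal end-to-end sketch that fits a toy quadratic objective with adan. The loss function, target vector, and step count are illustrative assumptions, not part of the package:

import jax
import jax.numpy as jnp
import optax
from optax_adan import adan

def loss_fn(params):
    # toy objective (assumed for illustration): squared distance from a fixed target
    target = jnp.array([1.0, -2.0, 3.0])
    return jnp.sum((params - target) ** 2)

params = jnp.zeros(3)
optimizer = adan(learning_rate=0.01)
optimizer_state = optimizer.init(params)

for _ in range(1000):
    # compute gradients of the loss, transform them with adan, apply the updates
    grad = jax.grad(loss_fn)(params)
    updates, optimizer_state = optimizer.update(grad, optimizer_state, params)
    params = optax.apply_updates(params, updates)

print(params)  # should approach the target vector

Since adan follows the standard optax init/update interface, it should also compose with the rest of the optax toolkit (for example optax.chain) like any other optax optimizer.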
Download files
Source Distribution
optax-adan-0.1.4.tar.gz (7.6 kB)
Built Distribution
optax_adan-0.1.4-py3-none-any.whl
Hashes for optax_adan-0.1.4-py3-none-any.whl

Algorithm | Hash digest
--- | ---
SHA256 | fa1f472258057d3572e6644688a53add5d7fbb57137a5f57ca311d86487a387c
MD5 | df54a14df74225406c6dfa9ef8603356
BLAKE2b-256 | 9e8bef5d8557e2588f28ec55d47dfe0e37f5c0d239df563f8c3e2b2727886206