An implementation of the Adan optimization algorithm for Optax.

Project description

optax-adan

An implementation of the Adan optimizer for Optax, based on the paper Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models.
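
For reference, the algorithm in the paper keeps three moving averages: one over gradients, one over gradient differences, and one over the squared Nesterov-corrected gradient. Below is a minimal single-array sketch of that update, with bias correction omitted for brevity; the function name, defaults, and structure are an illustration of the paper's Algorithm 1, not the package's actual implementation.

import jax.numpy as jnp

def adan_step(params, grad, prev_grad, m, v, n,
              lr=0.001, b1=0.98, b2=0.92, b3=0.99, eps=1e-8, wd=0.0):
    # Hypothetical sketch of one Adan update (bias correction omitted).
    # On the first step, pass prev_grad=grad so the difference term is zero.
    diff = grad - prev_grad
    m = b1 * m + (1 - b1) * grad            # EMA of gradients
    v = b2 * v + (1 - b2) * diff            # EMA of gradient differences
    g_nesterov = grad + b2 * diff           # Nesterov-style corrected gradient
    n = b3 * n + (1 - b3) * g_nesterov**2   # EMA of its elementwise square
    update = (m + b2 * v) / (jnp.sqrt(n) + eps)
    new_params = (params - lr * update) / (1.0 + lr * wd)  # decoupled weight decay
    return new_params, m, v, n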

A Colab notebook with a usage example can be found here.

How to use:

Install the package:

python3 -m pip install optax-adan

Import the optimizer:

from optax_adan import adan

Use it as you would any other optimizer from Optax:

import optax  # apply_updates lives in optax itself

# init
optimizer = adan(learning_rate=0.01)
optimizer_state = optimizer.init(initial_params)

# step
grad = grad_func(params)  # gradients from your autodiff of choice
updates, optimizer_state = optimizer.update(grad, optimizer_state, params)
params = optax.apply_updates(params, updates)
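
To see the whole loop end to end, here is a self-contained toy example; the quadratic loss, step count, and learning rate are made up for illustration, and it assumes jax and optax are installed:

import jax
import jax.numpy as jnp
import optax
from optax_adan import adan

# Toy objective: recover a fixed target vector by least squares.
target = jnp.array([1.0, -2.0, 3.0])

def loss(p):
    return jnp.sum((p - target) ** 2)

params = jnp.zeros(3)
optimizer = adan(learning_rate=0.1)
optimizer_state = optimizer.init(params)

for _ in range(200):
    grad = jax.grad(loss)(params)
    updates, optimizer_state = optimizer.update(grad, optimizer_state, params)
    params = optax.apply_updates(params, updates)

print(params)  # should be close to [1. -2. 3.]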

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

optax-adan-0.1.5.tar.gz (7.7 kB)

Uploaded Source

File details

Details for the file optax-adan-0.1.5.tar.gz.

File metadata

  • Download URL: optax-adan-0.1.5.tar.gz
  • Upload date:
  • Size: 7.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.13

File hashes

Hashes for optax-adan-0.1.5.tar.gz
  • SHA256: a0a7de368d3b84f5b61f36293945d6d64b30fa284c2098f74fc21349a6273d38
  • MD5: f22d61a29f17f9ee367f47258c54194b
  • BLAKE2b-256: 4a804423ad06bd3bc95e8bd16a0d487abe50193e33d920be7df7aa0a56b5de28

See more details on using hashes here.
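
For example, the published SHA256 digest can be checked against a downloaded copy with Python's standard library (the archive is assumed to sit in the current directory):

import hashlib

expected = "a0a7de368d3b84f5b61f36293945d6d64b30fa284c2098f74fc21349a6273d38"

# Hash the downloaded archive and compare with the digest listed above.
with open("optax-adan-0.1.5.tar.gz", "rb") as f:
    actual = hashlib.sha256(f.read()).hexdigest()

print("OK" if actual == expected else "MISMATCH")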
