Distributed Shampoo (Second-Order Optimizer for Deep Learning) Optax Optimizer
Project description
Optimization in machine learning, both theoretical and applied, is presently dominated by first-order gradient methods such as stochastic gradient descent. Second-order optimization methods, which involve second derivatives and/or second-order statistics of the data, are far less prevalent despite their strong theoretical properties, because of their prohibitive computation, memory, and communication costs.
Here we present a scalable implementation of a second-order preconditioning method (concretely, a variant of full-matrix Adagrad) that provides significant convergence and wall-clock time improvements compared to conventional first-order methods on state-of-the-art deep models.
Paper preprint: https://arxiv.org/abs/2002.09018
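As a rough conceptual sketch of this preconditioning idea: for a matrix-shaped parameter, Shampoo accumulates left and right second-order statistics of the gradients and preconditions each gradient with their inverse fourth roots. The snippet below is illustrative only; the names are not part of this package's API, and the refinements described in the paper (blocked preconditioners, grafting, and infrequent inverse-root computation) are omitted.

```python
# Conceptual sketch of one Shampoo preconditioning step for a single 2-D
# parameter, following https://arxiv.org/abs/2002.09018. Names and the
# plain eigendecomposition-based inverse root are illustrative; this is
# not the optax-shampoo implementation.
import jax.numpy as jnp


def matrix_inverse_pth_root(stat, p, eps=1e-6):
    """Returns stat^(-1/p) for a symmetric PSD statistics matrix."""
    eigvals, eigvecs = jnp.linalg.eigh(stat)
    inv_root = jnp.power(jnp.maximum(eigvals, eps), -1.0 / p)
    return (eigvecs * inv_root) @ eigvecs.T


def shampoo_step(param, grad, stats_l, stats_r, learning_rate=0.1):
    """One update for a parameter of shape (m, n) with gradient `grad`."""
    stats_l = stats_l + grad @ grad.T   # left statistic, shape (m, m)
    stats_r = stats_r + grad.T @ grad   # right statistic, shape (n, n)
    # Precondition the gradient from both sides with inverse fourth roots.
    precond = (
        matrix_inverse_pth_root(stats_l, 4)
        @ grad
        @ matrix_inverse_pth_root(stats_r, 4)
    )
    return param - learning_rate * precond, stats_l, stats_r
```

In practice the statistics would be initialized to small multiples of the identity, and parameters with more than two dimensions keep one statistic per dimension.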
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
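The distributions listed above can also be installed from PyPI with `pip install optax-shampoo`. The sketch below shows how the optimizer would plug into the standard Optax update loop; the `distributed_shampoo` import path and its constructor arguments are assumptions for illustration, so consult the project's repository for the exact API.

```python
# Hypothetical usage sketch. The import path and constructor arguments are
# assumptions; only the init/update/apply_updates pattern is the standard
# Optax GradientTransformation interface.
import jax
import jax.numpy as jnp
import optax

from optax_shampoo import distributed_shampoo  # assumed entry point

optimizer = distributed_shampoo(learning_rate=1e-3, block_size=128)  # assumed signature

params = {"w": jnp.ones((8, 8))}
opt_state = optimizer.init(params)


def loss_fn(p):
    return jnp.sum(p["w"] ** 2)


grads = jax.grad(loss_fn)(params)
updates, opt_state = optimizer.update(grads, opt_state, params)
params = optax.apply_updates(params, updates)
```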
File details
Details for the file optax_shampoo-0.0.6.tar.gz.
File metadata
- Download URL: optax_shampoo-0.0.6.tar.gz
- Upload date:
- Size: 31.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.10.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | 0aa9ae859c7be2f29cc3e260931bf7c9560c4357a1fd3675478b20c6d099e3bb
MD5 | 65bfa399c8007a3b89b3e262e5dd42e1
BLAKE2b-256 | f4ab5c370d310ace203de7ebf50f0800e1c36041de707e5c7374c978d8d727cf
File details
Details for the file optax_shampoo-0.0.6-py3-none-any.whl.
File metadata
- Download URL: optax_shampoo-0.0.6-py3-none-any.whl
- Upload date:
- Size: 32.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.10.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | 935bf236149365e6afbaef5de2384008c78cb5a7d2ac74b0c75181fadf14c6aa
MD5 | 179f8497a911f496081b02a7683d8fb2
BLAKE2b-256 | 803b6609f0b3a98527a7fe2c2b4b48944b455ab65a3c9223e86d3cd5cf823a5e
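After downloading either file, the digests listed above can be recomputed locally to check integrity. A minimal sketch in Python, assuming the wheel sits in the current directory:

```python
# Recompute the SHA256 digest of a downloaded distribution and compare it
# against the value listed in the table above. The file name assumes the
# wheel was downloaded to the current directory.
import hashlib


def sha256_of(path, chunk_size=1 << 20):
    """Streams the file in chunks so large downloads are not held in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


expected = "935bf236149365e6afbaef5de2384008c78cb5a7d2ac74b0c75181fadf14c6aa"
print(sha256_of("optax_shampoo-0.0.6-py3-none-any.whl") == expected)
```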