# PyTorch Sparse AdamW

This repository contains a sparse version of the AdamW optimizer.
The SparseAdamW optimizer behaves like AdamW, but, in the same way as the stock SparseAdam optimizer, it updates only the moment statistics for parameters whose gradients were actually computed. It can only be used with modules that produce sparse gradients, e.g. `nn.Embedding`.
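The behaviour described above can be sketched as follows. This is a minimal illustration, not taken from the package's own docs: the import name `torch_sparse_adamw.SparseAdamW` is assumed from the package name, and PyTorch's built-in `torch.optim.SparseAdam` is used as a fallback, since the README states that SparseAdamW updates statistics the same way.

```python
import torch
import torch.nn as nn

# Assumed import path (derived from the package name); fall back to the
# built-in SparseAdam, which handles sparse gradients the same way.
try:
    from torch_sparse_adamw import SparseAdamW as Opt
except ImportError:
    from torch.optim import SparseAdam as Opt

# sparse=True makes the embedding emit sparse gradients,
# the only kind these optimizers accept.
emb = nn.Embedding(100, 16, sparse=True)
opt = Opt(emb.parameters(), lr=1e-3)

before = emb.weight.detach().clone()
ids = torch.tensor([1, 5, 7])
emb(ids).sum().backward()   # gradient is nonzero only for rows 1, 5, 7
opt.step()                  # only those rows (and their statistics) are updated
```

After `opt.step()`, rows of `emb.weight` that were not indexed in the batch remain untouched, which is the point of the sparse update.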
## Install

Install by running:

```bash
pip install torch-sparse-adamw
```
## Hashes for torch_sparse_adamw-1.0.0-py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | 0729195d888716ecdfd45cfc7f6d807589c0e4942e4a8256c36b9b4c49b4fba1 |
| MD5 | 5aa7f3936f3618292cf5c9097b9a2eda |
| BLAKE2b-256 | d99f1811265caceb27ffe317d30e7e033fcc9679cfa34966ad5b970e4fd382b9 |