
Calculate example-wise gradients

Project description

This repository is still under construction (2021/07/21).

ExGrads

This repository provides a hook script for calculating example-wise gradients efficiently.

Note

This script uses the work as an important reference.
I think it is a great first step toward handling per-example gradients efficiently,
and I'd like to express my respect for it.

Features of This Script

  • Calculate example-wise gradients efficiently
    Unlike the referenced work, there is no method for calculating the Hessian.
  • Handle general modules
    Including Linear, Conv2d, BatchNorm2d, and BatchNorm1d. More modules will be added soon.
  • How to use this script in practice
    1. Fast and exact calculation of $\text{tr}[\mathbf{H}]$ (see the note after this list)
    2. Other usages (coming soon, within a month I hope)
  • Less-memory mode (WIP)
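Here $\text{tr}[\mathbf{H}]$ denotes the trace of the Hessian of the loss with respect to the model parameters. As a notational sketch (the symbols $\mathcal{L}$ and $\theta$ are mine, not the project's):

$\text{tr}[\mathbf{H}] = \sum_i \frac{\partial^2 \mathcal{L}}{\partial \theta_i^2}, \qquad \mathbf{H} = \nabla^2_{\theta} \mathcal{L}(\theta)$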

How to Use

import torch
import exgrads as ExGrads

batch,dim,label = 5,3,2
x = torch.randn(batch,dim)                                  #: inputs
y = torch.randint(low=0,high=label,size=(batch,))           #: target labels
model   = torch.nn.Sequential(torch.nn.Linear(dim, label))  #: PyTorch model
loss_fn = torch.nn.functional.cross_entropy                 #: loss function

ExGrads.add_hooks(model)
model.zero_grad()
loss_fn(model(x), y).backward()
ExGrads.compute_grad1(model)

# param.grad:     gradient averaged over the batch
# param.grad1[i]: gradient of i-th example
for param in model.parameters():
	assert(torch.allclose(param.grad1.sum(dim=0), param.grad))
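As a further usage sketch (my own example, not documented ExGrads API; it assumes param.grad1 has shape (batch, *param.shape), as the assertion above implies), per-example gradient norms can be collected after compute_grad1:

# continues the snippet above: hooks added, backward() and compute_grad1() already run
per_example_sq_norms = torch.zeros(batch)
for param in model.parameters():
	# flatten all non-batch dimensions and accumulate each example's squared gradient norm
	per_example_sq_norms += param.grad1.reshape(batch, -1).pow(2).sum(dim=1)
per_example_norms = per_example_sq_norms.sqrt()  # one gradient-norm value per example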

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ExGrads-0.1.1.tar.gz (3.7 kB)


Built Distribution

ExGrads-0.1.1-py3-none-any.whl (3.9 kB)


File details

Details for the file ExGrads-0.1.1.tar.gz.

File metadata

  • Download URL: ExGrads-0.1.1.tar.gz
  • Upload date:
  • Size: 3.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.2 importlib_metadata/4.6.3 pkginfo/1.7.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.0 CPython/3.8.10

File hashes

Hashes for ExGrads-0.1.1.tar.gz:

  • SHA256: 36bc22295353bfc552f22bb3ef83a8988c4fc19c67af41036e7bc2c504ee5951
  • MD5: 02984e7f0cbf12fd4fc086c378c22c7a
  • BLAKE2b-256: 66f6e1e07bfdeec06c25a2381797d84e1c7c6fe8808a424032794c18b94be473

See more details on using hashes here.

File details

Details for the file ExGrads-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: ExGrads-0.1.1-py3-none-any.whl
  • Upload date:
  • Size: 3.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.2 importlib_metadata/4.6.3 pkginfo/1.7.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.0 CPython/3.8.10

File hashes

Hashes for ExGrads-0.1.1-py3-none-any.whl:

  • SHA256: d2aea0487c2c08b0cd4aa22d53b73e32b4ef949bf1d7610fe271c20b9510a501
  • MD5: f1d4d8154320019c090e552d3daed7c1
  • BLAKE2b-256: ba499379ecae2e2d283e84437f08b5e1d45821559dcdaed0b9647860d8458dfa

See more details on using hashes here.
