# BackPACK: Packing more into backprop

BackPACK is built on top of PyTorch. It efficiently computes quantities other than the gradient.

Provided quantities include:

• Individual gradients from a mini-batch
• Estimates of the gradient variance or second moment
• Approximate second-order information (diagonal and Kronecker approximations)

Motivation: Most of these quantities are not inherently expensive to compute; they typically require only a small modification of the existing backward pass, reusing information that is already being backpropagated. They are, however, difficult to access in the current software environment.

## Installation

```bash
pip install backpack-for-pytorch
```


## Contributing

BackPACK is actively being developed. We appreciate any help; if you are considering contributing, do not hesitate to contact us. An overview of the development procedure is provided in the developer README.

## How to cite

If you use BackPACK, consider citing the paper:

```bibtex
@inproceedings{dangel2020backpack,
    title     = {Back{PACK}: Packing more into Backprop},
    author    = {Felix Dangel and Frederik Kunstner and Philipp Hennig},
    booktitle = {International Conference on Learning Representations},
    year      = {2020},
    url       = {https://openreview.net/forum?id=BJlrF24twB}
}
```
