Attribution of Neural Networks using PyTorch
Project description
Zennit
Zennit (Zennit explains neural networks in torch) is a
high-level framework in Python, using PyTorch, for explaining and exploring neural
networks. Its design philosophy is to provide high customizability and
integration as a standardized solution for applying rule-based attribution
methods in research, with a strong focus on Layerwise Relevance Propagation
(LRP). Zennit strictly requires models to use PyTorch's torch.nn.Module
structure (including activation functions).
Zennit is currently under active development, but should be mostly stable.
If you find Zennit useful for your research, please consider citing our related paper:
@article{anders2021software,
  author  = {Anders, Christopher J. and
             Neumann, David and
             Samek, Wojciech and
             Müller, Klaus-Robert and
             Lapuschkin, Sebastian},
  title   = {Software for Dataset-wide XAI: From Local Explanations to Global Insights with {Zennit}, {CoRelAy}, and {ViRelAy}},
  journal = {CoRR},
  volume  = {abs/2106.13200},
  year    = {2021},
}
Documentation
The latest documentation is hosted at zennit.readthedocs.io.
Install
To install directly from PyPI using pip, use:
$ pip install zennit
Alternatively, install from a manually cloned repository to try out the examples:
$ git clone https://github.com/chr5tphr/zennit.git
$ pip install ./zennit
Usage
At its heart, Zennit registers hooks at PyTorch's Module level to modify the
backward pass to produce rule-based attributions like LRP (instead of the usual
gradient). All rules are implemented as hooks (zennit/rules.py) and most use
the LRP basis BasicHook (zennit/core.py).
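For illustration, a single rule hook can also be attached to an individual module by hand. The following is only a sketch; the Hook.register interface and the epsilon keyword are assumptions based on Zennit's current API:

import torch
from zennit.rules import Epsilon

# a single dense layer to attach the rule to
module = torch.nn.Linear(8, 4)

# Epsilon is an LRP rule implemented as a hook (a BasicHook subclass)
rule = Epsilon(epsilon=1e-6)

# registering the hook modifies this module's backward pass; the returned
# handle can be removed later to restore the plain gradient
handle = rule.register(module)

data = torch.randn(2, 8, requires_grad=True)
out = module(data)
# the "gradient" computed here is the Epsilon-rule relevance
relevance, = torch.autograd.grad(out, data, torch.ones_like(out))

handle.remove()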
Composites (zennit/composites.py) are a way of choosing the right hook for the
right layer. In addition to the abstract NameMapComposite, which assigns hooks
to layers by name, and LayerMapComposite, which assigns hooks to layers based
on their type, there exist explicit composites, some of which are
EpsilonGammaBox (ZBox in input, Epsilon in dense, Gamma in convolutions) or
EpsilonPlus (Epsilon in dense, ZPlus in convolutions). All composites may be
used by directly importing from zennit.composites, or by using their
snake-case name as key for zennit.composites.COMPOSITES.
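As an illustration, a custom LayerMapComposite and the COMPOSITES registry might be used as follows. This is only a sketch; the abstract layer types in zennit.types and the registry key are assumptions based on Zennit's documented API:

import torch
from zennit.composites import LayerMapComposite, COMPOSITES
from zennit.rules import Epsilon, ZPlus, Pass
from zennit.types import Activation, Convolution

# map rules to layer types: Pass for activations, ZPlus for convolutions,
# Epsilon for dense (torch.nn.Linear) layers
composite = LayerMapComposite(layer_map=[
    (Activation, Pass()),
    (Convolution, ZPlus()),
    (torch.nn.Linear, Epsilon(epsilon=1e-6)),
])

# built-in composites are also available via their snake-case name
composite_plus = COMPOSITES['epsilon_plus']()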
Canonizers (zennit/canonizers.py) temporarily transform models into a
canonical form, if required, like SequentialMergeBatchNorm, which
automatically detects and merges BatchNorm layers followed by linear layers in
sequential networks, or AttributeCanonizer, which temporarily overwrites
attributes of applicable modules, e.g. to handle the residual connection in
ResNet-Bottleneck modules.
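For torchvision models, Zennit also provides ready-made canonizers in zennit.torchvision. The following sketch attributes a ResNet50 this way, assuming the ResNetCanonizer and EpsilonPlusFlat names from Zennit's documentation (the Gradient attributor is described in the next paragraph):

import torch
from torchvision.models import resnet50
from zennit.attribution import Gradient
from zennit.composites import EpsilonPlusFlat
from zennit.torchvision import ResNetCanonizer

model = resnet50()
# ResNetCanonizer merges batch norms and handles the residual connections
composite = EpsilonPlusFlat(canonizers=[ResNetCanonizer()])

data = torch.randn(1, 3, 224, 224)
with Gradient(model=model, composite=composite) as attributor:
    out, relevance = attributor(data, torch.eye(1000)[[0]])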
Attributors (zennit/attribution.py) directly execute the necessary steps to
apply certain attribution methods, like the simple Gradient, SmoothGrad or
Occlusion. An optional composite may be passed, which is applied during the
attributor's execution to compute the modified gradient or hybrid methods.
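A minimal sketch of an attributor used without a composite, computing SmoothGrad on the plain gradient (the noise_level and n_iter parameter names are assumptions about SmoothGrad's signature):

import torch
from torchvision.models import vgg16
from zennit.attribution import SmoothGrad

model = vgg16()
data = torch.randn(1, 3, 224, 224)

# SmoothGrad averages gradients over several noisy copies of the input;
# without a composite, each pass uses the unmodified gradient
with SmoothGrad(model=model, noise_level=0.1, n_iter=20) as attributor:
    out, relevance = attributor(data, torch.eye(1000)[[0]])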
Using all of these components, an LRP-type attribution for VGG16 with batch-norm layers with respect to label 0 may be computed using:
import torch
from torchvision.models import vgg16_bn
from zennit.composites import EpsilonGammaBox
from zennit.canonizers import SequentialMergeBatchNorm
from zennit.attribution import Gradient

data = torch.randn(1, 3, 224, 224)
model = vgg16_bn()

canonizers = [SequentialMergeBatchNorm()]
composite = EpsilonGammaBox(low=-3., high=3., canonizers=canonizers)

with Gradient(model=model, composite=composite) as attributor:
    out, relevance = attributor(data, torch.eye(1000)[[0]])
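The resulting relevance can then be rendered as an image, for example with zennit.image.imsave; the colormap name below is an assumption based on Zennit's example script:

from zennit.image import imsave

# sum over the color channels and save with a symmetric color range
heatmap = relevance[0].sum(0)
amax = heatmap.abs().max().item()
imsave('heatmap.png', heatmap, vmin=-amax, vmax=amax, cmap='coldnhot')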
A similar setup using the example script produces the attribution heatmaps shown below.
For more details and examples, have a look at our documentation.
More Example Heatmaps
More heatmaps of various attribution methods for VGG16 and ResNet50, all
generated using share/example/feed_forward.py, can be found below.
Heatmaps for VGG16
Heatmaps for ResNet50
Contributing
See CONTRIBUTING.md for detailed instructions on how to contribute.
License
Zennit is licensed under the GNU LESSER GENERAL PUBLIC LICENSE VERSION 3 OR LATER -- see the LICENSE, COPYING and COPYING.LESSER files for details.
File details
Details for the file zennit-0.5.0.tar.gz.
File metadata
- Download URL: zennit-0.5.0.tar.gz
- Upload date:
- Size: 1.4 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.9.9
File hashes
Algorithm | Hash digest
---|---
SHA256 | 640647500099d2d18ee4157ff96cd924ef34dff97e1a39afda987e482a95c087
MD5 | 056f182f6c8ba33dd1d15bafb1e24c67
BLAKE2b-256 | 052ce7f6b73d0fb9687053d8a406ccc7387cc0d0dc03e872e4fe5c7b0516771b
File details
Details for the file zennit-0.5.0-py3-none-any.whl.
File metadata
- Download URL: zennit-0.5.0-py3-none-any.whl
- Upload date:
- Size: 53.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.1 CPython/3.9.9
File hashes
Algorithm | Hash digest
---|---
SHA256 | 5cee1e6ba95a574bc2ef3c6876802b390290f1d37a2fff6a748ad317e531a2b2
MD5 | 3e856359515621a88f468c7491f65907
BLAKE2b-256 | 8b145172eebda07dd90626b41a4396368aa51e00c432cd144e892eb1c3b3bdd7