
# Saliency Methods

## Introduction

This repository contains code for the following saliency techniques:

* Vanilla Gradients
* Guided Backpropagation
* Integrated Gradients
* Occlusion
* SmoothGrad* (augmentation of any of the above)

\*Developed by PAIR.

This list is by no means comprehensive. We are accepting pull requests to add new methods!

## Download

```
pip install saliency
```

or for the development version:

```
git clone https://github.com/pair-code/saliency
cd saliency
```

## Usage

Each saliency mask class extends from the `SaliencyMask` base class. This class contains the following methods:

* `__init__(graph, session, y, x)`: Constructor of the SaliencyMask. This can modify the graph, or sometimes create a new graph. Often this will add nodes to the graph, so this shouldn't be called continuously. `y` is the output tensor to compute saliency masks with respect to, and `x` is the input tensor, with the outermost dimension being batch size.
* `GetMask(x_value, feed_dict)`: Returns a mask of the shape of non-batched `x_value` given by the saliency technique.
* `GetSmoothedMask(x_value, feed_dict)`: Returns a mask of the shape of non-batched `x_value`, smoothed with the SmoothGrad technique.
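For intuition, the smoothing that `GetSmoothedMask` performs (the SmoothGrad technique: average the masks computed on several noisy copies of the input) can be sketched framework-free in NumPy. The function name `smoothed_mask` and its parameters here are illustrative stand-ins, not the library's actual signature:

```python
import numpy as np

def smoothed_mask(get_mask, x_value, stdev_spread=0.15, nsamples=25, seed=0):
    """SmoothGrad sketch: average masks computed on noisy copies of the input.

    get_mask: any function mapping an input array to a saliency mask
    stdev_spread: noise level, as a fraction of the input's value range
    nsamples: number of noisy samples to average over
    """
    rng = np.random.RandomState(seed)
    stdev = stdev_spread * (x_value.max() - x_value.min())
    total = np.zeros_like(x_value, dtype=np.float64)
    for _ in range(nsamples):
        noisy = x_value + rng.normal(0, stdev, x_value.shape)
        total += get_mask(noisy)
    return total / nsamples

# Toy check: with an identity "mask", smoothing recovers the input on average.
x = np.linspace(0.0, 1.0, 16).reshape(4, 4)
mask = smoothed_mask(lambda v: v, x, nsamples=200)
```

In the library, `get_mask` would be the gradient-based mask of a real model; the averaging is what suppresses the high-frequency noise visible in raw gradient maps.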

The visualization module contains two visualization methods:

* `VisualizeImageGrayscale(image_3d, percentile)`: Marginalizes across the absolute value of each channel to create a 2D single-channel image, and clips the image at the given percentile of the distribution. This method returns a 2D tensor normalized between 0 and 1.
* `VisualizeImageDiverging(image_3d, percentile)`: Marginalizes across the value of each channel to create a 2D single-channel image, and clips the image at the given percentile of the distribution. This method returns a 2D tensor normalized between -1 and 1, where zero remains unchanged.

If the sign of the value given by the saliency mask is not important, then use `VisualizeImageGrayscale`, otherwise use `VisualizeImageDiverging`. See the SmoothGrad paper for more details on which visualization method to use.
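As a rough sketch of what these two visualizations compute (a NumPy approximation of the idea, not the module's actual implementation):

```python
import numpy as np

def visualize_grayscale(image_3d, percentile=99):
    """Sum absolute values across channels, clip at a percentile, scale to [0, 1]."""
    flat = np.sum(np.abs(image_3d), axis=2)
    vmax = np.percentile(flat, percentile)
    return np.clip(flat / vmax, 0.0, 1.0)

def visualize_diverging(image_3d, percentile=99):
    """Sum signed values across channels, clip symmetrically, scale to [-1, 1].

    Dividing by a single symmetric span keeps zero fixed at zero.
    """
    flat = np.sum(image_3d, axis=2)
    span = np.percentile(np.abs(flat), percentile)
    return np.clip(flat / span, -1.0, 1.0)

# A random HxWxC "mask" stands in for a real saliency mask here.
mask = np.random.RandomState(0).randn(8, 8, 3)
gray = visualize_grayscale(mask)
div = visualize_diverging(mask)
```

The grayscale variant discards sign, so positive and negative attributions look the same; the diverging variant preserves sign, which is why it suits methods where direction matters.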

## Examples

This example iPython notebook showing these techniques is a good starting place.

Another example of using GuidedBackprop with SmoothGrad from TensorFlow:

```
from guided_backprop import GuidedBackprop
import visualization

...
# Tensorflow graph construction here.
y = logits
x = tf.placeholder(...)
...

# Compute guided backprop.
# NOTE: This creates another graph that gets cached, try to avoid creating many
# of these.
guided_backprop_saliency = GuidedBackprop(graph, session, y, x)

...
image = GetImagePNG(...)
...

# Compute the SmoothGrad-augmented guided backprop mask.
smoothgrad_guided_backprop = guided_backprop_saliency.GetSmoothedMask(
    image, feed_dict={...})

# Compute a 2D tensor for visualization.
grayscale_visualization = visualization.VisualizeImageGrayscale(
    smoothgrad_guided_backprop)
```

This is not an official Google product.
