Neural network visualization toolkit for tf.keras

Project description

tf-keras-vis

Downloads PyPI version Python package License: MIT

Notes

We've released v0.7.0! In this release, the gradient calculation of ActivationMaximization has been changed in order to fix a critical problem. Although the calculation results are now a bit different from past versions, you can keep the previous behavior by using the legacy implementation as follows:

# from tf_keras_vis.activation_maximization import ActivationMaximization
from tf_keras_vis.activation_maximization.legacy import ActivationMaximization

In addition to the above, we've also fixed some problems related to Regularizers. We now provide the tf_keras_vis.activation_maximization.regularizers module, which contains the fixed regularizers; however, as with ActivationMaximization, you can also keep using the legacy implementation as follows:

# from tf_keras_vis.activation_maximization.regularizers import Norm, TotalVariation2D 
from tf_keras_vis.utils.regularizers import Norm, TotalVariation2D

Please see the release note for details. If you face any problem related to this release, please feel free to ask us on the Issues page!

Overview

tf-keras-vis is a visualization toolkit for debugging tf.keras.Model in Tensorflow2.0+. Currently supported visualization methods include ActivationMaximization, Saliency maps (including SmoothGrad), and class activation maps such as GradCAM, GradCAM++, and ScoreCAM/Faster-ScoreCAM.

tf-keras-vis is designed to be lightweight, flexible, and easy to use. All visualizations share the following features:

  • Support N-dim image inputs; that is, not only 2D pictures but also, for example, 3D images.
  • Support batch-wise processing, so multiple input images can be processed efficiently (see the sketch after this list).
  • Support models that have multiple inputs, multiple outputs, or both.
  • Support mixed-precision models.

And, in ActivationMaximization,

  • Support Optimizers that are built into tf.keras.
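
For example, the batch-wise processing above means that one call can produce one attention map per input image. Below is a minimal sketch assuming YOUR_MODEL_INSTANCE is an image classifier and IMAGE_BATCH is a hypothetical NumPy array containing three input images (the class indices are placeholders):

from tf_keras_vis.gradcam_plus_plus import GradcamPlusPlus
from tf_keras_vis.utils.scores import CategoricalScore

# One target class per image in the batch (placeholder indices).
score = CategoricalScore([1, 294, 413])

gradcam = GradcamPlusPlus(YOUR_MODEL_INSTANCE, clone=True)

# A single call processes the whole batch;
# cams[i] is the attention map for IMAGE_BATCH[i].
cams = gradcam(score, IMAGE_BATCH)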

Visualizations

Visualizing Dense Layer

Visualizing Convolutional Filter

GradCAM

The images above are generated by GradCAM++.

Saliency Map

The images above are generated by SmoothGrad.

Usage

  • ActivationMaximization (Visualizing Convolutional Filter)
import tensorflow as tf
from matplotlib import pyplot as plt
from tf_keras_vis.activation_maximization import ActivationMaximization
from tf_keras_vis.activation_maximization.callbacks import Progress
from tf_keras_vis.activation_maximization.input_modifiers import Jitter, Rotate2D
from tf_keras_vis.activation_maximization.regularizers import TotalVariation2D, Norm
from tf_keras_vis.utils.model_modifiers import ExtractIntermediateLayer, ReplaceToLinear
from tf_keras_vis.utils.scores import CategoricalScore

# Create the visualization instance.
# All visualization classes accept a model and model-modifiers which, for example,
#     replace the activation of the last layer with a linear function, in their constructors.
activation_maximization = \
   ActivationMaximization(YOUR_MODEL_INSTANCE,
                          model_modifier=[ExtractIntermediateLayer('layer_name'),
                                          ReplaceToLinear()],
                          clone=False)

# You can use a Score class to specify the visualization target you want,
# and add regularizers or input-modifiers as needed.
activations = \
   activation_maximization(CategoricalScore(FILTER_INDEX),
                           steps=200,
                           input_modifiers=[Jitter(jitter=16), Rotate2D(degree=1)],
                           regularizers=[TotalVariation2D(weight=1.0),
                                         Norm(weight=0.3, p=1)],
                           optimizer=tf.keras.optimizers.RMSprop(1.0, 0.999),
                           callbacks=[Progress()])

## Since v0.6.0, calling `astype()` is NOT necessary.
# activations = activations[0].astype(np.uint8)

# Render
plt.imshow(activations[0])
  • GradCAM++
import numpy as np
from matplotlib import pyplot as plt
from matplotlib import cm
from tf_keras_vis.gradcam_plus_plus import GradcamPlusPlus
from tf_keras_vis.utils.model_modifiers import ReplaceToLinear
from tf_keras_vis.utils.scores import CategoricalScore

# Create GradCAM++ object
gradcam = GradcamPlusPlus(YOUR_MODEL_INSTANCE,
                          model_modifier=ReplaceToLinear(),
                          clone=True)

# Generate cam with GradCAM++
cam = gradcam(CategoricalScore(CATEGORICAL_INDEX),
              SEED_INPUT,
              penultimate_layer=-1)

## Since v0.6.0, calling `normalize()` is NOT necessary.
# cam = normalize(cam)

plt.imshow(SEED_INPUT_IMAGE)
heatmap = np.uint8(cm.jet(cam[0])[..., :3] * 255)
plt.imshow(heatmap, cmap='jet', alpha=0.5) # overlay
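  • Saliency Map (SmoothGrad)

The Saliency Map images in the Visualizations section above were generated with SmoothGrad. Here is a minimal sketch, reusing the placeholders from the examples above (YOUR_MODEL_INSTANCE, CATEGORICAL_INDEX, SEED_INPUT); smooth_samples and smooth_noise control the SmoothGrad noise sampling, so check the API documentation if you need different settings:

from matplotlib import pyplot as plt
from tf_keras_vis.saliency import Saliency
from tf_keras_vis.utils.model_modifiers import ReplaceToLinear
from tf_keras_vis.utils.scores import CategoricalScore

# Create Saliency object
saliency = Saliency(YOUR_MODEL_INSTANCE,
                    model_modifier=ReplaceToLinear(),
                    clone=True)

# Generate a saliency map; smooth_samples > 0 enables SmoothGrad
saliency_map = saliency(CategoricalScore(CATEGORICAL_INDEX),
                        SEED_INPUT,
                        smooth_samples=20,  # number of noisy samples
                        smooth_noise=0.20)  # noise spread

plt.imshow(saliency_map[0], cmap='jet')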

Please see the guides below for more details:

Getting Started Guides

[NOTE] If you have ever used keras-vis, you may feel that tf-keras-vis is similar to keras-vis. In fact, tf-keras-vis is derived from keras-vis, and the visualization methods both libraries provide are almost the same. However, please note that the tf-keras-vis APIs are NOT compatible with keras-vis.

Requirements

  • Python 3.6-3.9
  • tensorflow>=2.0.4

Installation

  • PyPI
$ pip install tf-keras-vis tensorflow
  • Source (for development)
$ git clone https://github.com/keisen/tf-keras-vis.git
$ cd tf-keras-vis
$ pip install -e .[develop]

Use Cases

  • chitra
    • A Deep Learning Computer Vision library for easy data loading, model building and model interpretation with GradCAM/GradCAM++.

Known Issues

  • With InceptionV3, ActivationMaximization doesn't work well; that is, it may generate meaninglessly blurred images.
  • With cascading models, GradCAM and GradCAM++ don't work well; that is, they may raise errors. We recommend using Faster-ScoreCAM in this case (see the sketch below).
  • Channels-first models and data are not supported.
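
A minimal sketch of the Faster-ScoreCAM workaround mentioned above, reusing the placeholders from the Usage section (YOUR_MODEL_INSTANCE, CATEGORICAL_INDEX, SEED_INPUT); the class and argument names follow tf-keras-vis v0.7, where setting max_N limits how many activation channels ScoreCAM evaluates:

from tf_keras_vis.scorecam import Scorecam
from tf_keras_vis.utils.model_modifiers import ReplaceToLinear
from tf_keras_vis.utils.scores import CategoricalScore

# Create ScoreCAM object
scorecam = Scorecam(YOUR_MODEL_INSTANCE,
                    model_modifier=ReplaceToLinear(),
                    clone=True)

# Generate heatmap with Faster-ScoreCAM (max_N > 0 enables the faster variant)
cam = scorecam(CategoricalScore(CATEGORICAL_INDEX),
               SEED_INPUT,
               penultimate_layer=-1,
               max_N=10)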

ToDo

  • Guides
    • Visualizing multiple attention or activation images at once, utilizing the model's batch processing
    • Define various score functions
    • Visualizing attentions with multiple inputs models
    • Visualizing attentions with multiple outputs models
    • Advanced score functions
    • Tuning Activation Maximization
    • Visualizing attentions for N-dim image inputs
  • Publish API documentations as a website
  • We're planning to add methods such as:
    • Deep Dream
    • Style transfer

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

tf-keras-vis-0.7.1.tar.gz (38.6 kB)

Uploaded Source

Built Distribution

tf_keras_vis-0.7.1-py3-none-any.whl (52.4 kB)

Uploaded Python 3

File details

Details for the file tf-keras-vis-0.7.1.tar.gz.

File metadata

  • Download URL: tf-keras-vis-0.7.1.tar.gz
  • Upload date:
  • Size: 38.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.6.0 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.61.1 CPython/3.9.5

File hashes

Hashes for tf-keras-vis-0.7.1.tar.gz

  • SHA256: ac801f6aa4442294ba1b10c8b5e348e1c7c9da7e744be96db2ee9b47f28688fb
  • MD5: c6d57d0555d1edfd62143ee7a6ca64a6
  • BLAKE2b-256: 240ebc9339d087a331e90031529fdb2b7c039c2e79b8f06d4663a477fae9159d

See more details on using hashes here.

File details

Details for the file tf_keras_vis-0.7.1-py3-none-any.whl.

File metadata

  • Download URL: tf_keras_vis-0.7.1-py3-none-any.whl
  • Upload date:
  • Size: 52.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.6.0 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.61.1 CPython/3.9.5

File hashes

Hashes for tf_keras_vis-0.7.1-py3-none-any.whl

  • SHA256: 7f8c23c472aa38afc0aab592da0370ad136b505e625ba148997c23c1888f7e98
  • MD5: a7a3e9a6b6162146add6fedd665b956e
  • BLAKE2b-256: 65cf1b3783885906676ed13fb5d5851a4ddd1c9f37be548d78e777ceaeed4c4f

See more details on using hashes here.
