Keract: Keras Activations + Gradients
pip install keract
You have just found an easy way to get the activations (outputs) and gradients for each layer of your Keras model (LSTM, conv nets, ...).
API
Get activations (outputs of each layer)
from keract import get_activations
get_activations(model, x)
Inputs:
model: a keras.models.Model object.
x: a numpy array to feed to the model as input. In the multi-input case, x is a list of numpy arrays. We use the Keras convention (as used in predict, fit, ...).
The output is a dictionary containing the activations for each layer of model
for the input x:
{
'conv2d_1/Relu:0': np.array(...),
'conv2d_2/Relu:0': np.array(...),
...,
'dense_2/Softmax:0': np.array(...)
}
The key is the name of the layer and the value is the corresponding output of the layer for the given input x.
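To make the output format concrete, here is a minimal numpy sketch (not keract's actual implementation) that builds the same kind of dictionary for a made-up two-layer dense network; the layer names and weights are hypothetical:

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical tiny network: dense(8, relu) -> dense(3, softmax).
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((4, 8)), np.zeros(8)
W2, b2 = rng.standard_normal((8, 3)), np.zeros(3)

def get_activations_sketch(x):
    """Mimic keract's output: a dict mapping layer names to activations."""
    acts = {}
    acts['dense_1/Relu:0'] = relu(x @ W1 + b1)
    acts['dense_2/Softmax:0'] = softmax(acts['dense_1/Relu:0'] @ W2 + b2)
    return acts

x = rng.standard_normal((5, 4))          # batch of 5 inputs
activations = get_activations_sketch(x)
for name, a in activations.items():
    print(name, a.shape)
```

Each value keeps the batch dimension, exactly as in the dictionary shown above.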
Get gradients of weights
from keract import get_gradients_of_trainable_weights
get_gradients_of_trainable_weights(model, x, y)
Inputs:
model: a keras.models.Model object.
x: input data (numpy array). Keras convention.
y: labels (numpy array). Keras convention.
The output is a dictionary mapping each trainable weight to the values of its gradients (with respect to x and y).
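The idea behind that dictionary can be sketched by hand for a single linear layer with a mean-squared-error loss; this is a hypothetical stand-in, not how keract computes gradients internally (keract delegates to the Keras backend):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((3, 2))          # trainable kernel
b = np.zeros(2)                          # trainable bias

def gradients_of_trainable_weights_sketch(x, y):
    """Return a dict mapping weight names to dLoss/dWeight
    for a linear layer trained with mean squared error."""
    pred = x @ W + b
    n = x.shape[0]
    d_pred = 2.0 * (pred - y) / n        # dMSE/dpred
    return {
        'dense/kernel:0': x.T @ d_pred,  # same shape as W
        'dense/bias:0': d_pred.sum(axis=0),
    }

x = rng.standard_normal((8, 3))
y = rng.standard_normal((8, 2))
grads = gradients_of_trainable_weights_sketch(x, y)
```

Note that each gradient has the same shape as the weight it corresponds to, which is also what the real function returns.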
Get gradients of activations
from keract import get_gradients_of_activations
get_gradients_of_activations(model, x, y)
Inputs:
model: a keras.models.Model object.
x: input data (numpy array). Keras convention.
y: labels (numpy array). Keras convention.
The output is a dictionary mapping each layer to the values of the gradients of its output (with respect to x and y).
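For intuition, here is a hand-written numpy backprop through a hypothetical two-layer network that collects the gradient of the loss with respect to each layer's output; the architecture, layer names, and loss are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 8))
W2 = rng.standard_normal((8, 2))

def gradients_of_activations_sketch(x, y):
    """Return dLoss/dActivation for each layer of a tiny
    two-layer network with mean-squared-error loss."""
    h = np.maximum(x @ W1, 0.0)          # hidden ReLU activation
    out = h @ W2                         # linear output activation
    n = x.shape[0]
    d_out = 2.0 * (out - y) / n          # gradient at the output layer
    d_h = (d_out @ W2.T) * (h > 0)       # backprop through W2 and ReLU
    return {'dense_1/Relu:0': d_h, 'dense_2/BiasAdd:0': d_out}

x = rng.standard_normal((5, 4))
y = rng.standard_normal((5, 2))
grads = gradients_of_activations_sketch(x, y)
```

As with activations, every gradient array has the same shape as the layer output it refers to.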
Examples
Examples are provided for:
keras.models.Sequential - mnist.py
keras.models.Model (multi-inputs) - multi_inputs.py
Recurrent networks - recurrent.py
In the case of MNIST with LeNet, we are able to fetch the activations for a batch of size 128:
conv2d_1/Relu:0 (128, 26, 26, 32)
conv2d_2/Relu:0 (128, 24, 24, 64)
max_pooling2d_1/MaxPool:0 (128, 12, 12, 64)
dropout_1/cond/Merge:0 (128, 12, 12, 64)
flatten_1/Reshape:0 (128, 9216)
dense_1/Relu:0 (128, 128)
dropout_2/cond/Merge:0 (128, 128)
dense_2/Softmax:0 (128, 10)
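The spatial sizes above follow from standard convolution arithmetic; assuming 3x3 kernels with 'valid' padding and a 2x2 max pool (consistent with the shapes listed, though the exact LeNet configuration is not stated here), they can be reproduced like this:

```python
def valid_conv(size, kernel):
    # 'valid' convolution with stride 1: output = input - kernel + 1
    return size - kernel + 1

s = 28                       # MNIST images are 28x28
s = valid_conv(s, 3)         # conv2d_1, 3x3 kernel -> 26
s = valid_conv(s, 3)         # conv2d_2, 3x3 kernel -> 24
s = s // 2                   # 2x2 max pooling      -> 12
flat = s * s * 64            # flatten 64 channels  -> 9216
print(s, flat)
```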
We can visualise the activations. Here's another example using VGG16:
cd examples
python vgg16.py
(Figure: a cat, the input image.)
(Figure: outputs of the first convolutional layer of VGG16.)
Also, we can visualise the heatmaps of the activations:
cd examples
python heat_map.py
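A common way to turn an activation volume into a heat map (the heat_map.py example may differ in detail) is to average over channels, normalise, and upsample back to the input resolution. A minimal numpy sketch of that idea:

```python
import numpy as np

def activation_heatmap(activation, out_h, out_w):
    """Collapse an (H, W, C) activation to one channel and
    nearest-neighbour upsample it to the input image size."""
    hm = activation.mean(axis=-1)                 # average over channels
    hm = hm - hm.min()
    hm = hm / (hm.max() + 1e-8)                   # normalise to [0, 1]
    rh, rw = out_h // hm.shape[0], out_w // hm.shape[1]
    return np.repeat(np.repeat(hm, rh, axis=0), rw, axis=1)

# e.g. a (12, 12, 64) activation such as max_pooling2d_1's output
act = np.random.default_rng(0).random((12, 12, 64))
heat = activation_heatmap(act, 24, 24)
```

The resulting array can then be overlaid on the input image with any plotting library.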
Hashes for keract-2.4.0-py2.py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | fb1f370302c6d9621b7b599cb725ad44f0c8c0d1375336ac36c14c49d6b66bb6
MD5 | a02b9d670d6a901372124ba9dce3edd4
BLAKE2b-256 | 3326f2c5ebbad38a0dc061ea67099d61831002c036ef02595fbaa0c53eb7ac25