Keras Activations and Gradients
Keract: Keras Activations + Gradients
```shell
pip install keract
```
You have just found an easy way to get the activations (outputs) and gradients for each layer of your Keras model (LSTM, conv nets, ...).
API
Get activations (outputs of each layer)
```python
from keract import get_activations
get_activations(model, x)
```
Inputs are:
- `model`: a `keras.models.Model` object.
- `x`: a NumPy array to feed to the model as input. In the multi-input case, `x` is a list. We use the Keras convention (as used in `predict`, `fit`, ...).
The output is a dictionary containing the activations for each layer of `model` for the input `x`:
```python
{
    'conv2d_1/Relu:0': np.array(...),
    'conv2d_2/Relu:0': np.array(...),
    ...,
    'dense_2/Softmax:0': np.array(...)
}
```
The key is the name of the layer and the value is the corresponding output of the layer for the given input `x`.
Get gradients of weights
```python
from keract import get_gradients_of_trainable_weights
get_gradients_of_trainable_weights(model, x, y)
```
Inputs are:
- `model`: a `keras.models.Model` object.
- `x`: input data (NumPy array). Keras convention.
- `y`: labels (NumPy array). Keras convention.

The output is a dictionary mapping each trainable weight to the values of its gradients (with respect to `x` and `y`).
Get gradients of activations
```python
from keract import get_gradients_of_activations
get_gradients_of_activations(model, x, y)
```
Inputs are:
- `model`: a `keras.models.Model` object.
- `x`: input data (NumPy array). Keras convention.
- `y`: labels (NumPy array). Keras convention.

The output is a dictionary mapping each layer to the values of its gradients (with respect to `x` and `y`).
Examples
Examples are provided for:
- `keras.models.Sequential` - mnist.py
- `keras.models.Model` - multi_inputs.py
- Recurrent networks - recurrent.py
In the case of MNIST with LeNet, we are able to fetch the activations for a batch of size 128:
```
conv2d_1/Relu:0            (128, 26, 26, 32)
conv2d_2/Relu:0            (128, 24, 24, 64)
max_pooling2d_1/MaxPool:0  (128, 12, 12, 64)
dropout_1/cond/Merge:0     (128, 12, 12, 64)
flatten_1/Reshape:0        (128, 9216)
dense_1/Relu:0             (128, 128)
dropout_2/cond/Merge:0     (128, 128)
dense_2/Softmax:0          (128, 10)
```
We can visualise the activations. Here's another example using VGG16:
```shell
cd examples
python vgg16.py
```
(Figure: a cat, and the outputs of the first convolutional layer of VGG16.)
Also, we can visualise the heatmaps of the activations:
```shell
cd examples
python heat_map.py
```
Hashes for keract-2.5.4-py2.py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | 6c8fc7c74fc37a8ec0626fd6d85a93b692699e0f40b418df2dcf544b4f66257c |
| MD5 | 538bfa604d81d6642e96fa6822f6dfe4 |
| BLAKE2b-256 | 0a48abf19d8831357d87266aba859e9f82fc253c77a710b9bc58fc9685949752 |