Keract: Keras Activations + Gradients
pip install keract
You have just found an easy way to get the activations (outputs) and gradients for each layer of your Keras model (LSTM, conv nets...).
API
Get activations (outputs of each layer)
from keract import get_activations
get_activations(model, x)
Inputs are:
- model: a keras.models.Model object.
- x: a numpy array to feed to the model as input. In the case of a multi-input model, x is a list of numpy arrays. We use the Keras convention (as used in predict, fit...).
The output is a dictionary containing the activations for each layer of the model for the input x:
{
'conv2d_1/Relu:0': np.array(...),
'conv2d_2/Relu:0': np.array(...),
...,
'dense_2/Softmax:0': np.array(...)
}
The key is the name of the layer and the value is the corresponding output of the layer for the given input x.
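As a minimal sketch (the small two-layer model below is illustrative, not taken from the repository):

import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keract import get_activations

# Illustrative model; any Keras model works the same way.
model = Sequential([
    Dense(16, activation='relu', input_shape=(8,)),
    Dense(3, activation='softmax'),
])
model.compile(optimizer='adam', loss='categorical_crossentropy')

x = np.random.random((4, 8))  # a batch of 4 samples
activations = get_activations(model, x)
for layer_name, output in activations.items():
    print(layer_name, output.shape)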
Get gradients of weights
Inputs are:
- model: a keras.models.Model object.
- x: input data (numpy array). Keras convention.
- y: labels (numpy array). Keras convention.
from keract import get_gradients_of_trainable_weights
get_gradients_of_trainable_weights(model, x, y)
The output is a dictionary mapping each trainable weight to the values of its gradients (with respect to x and y).
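A minimal sketch, assuming a compiled regression model (the model, x, and y below are illustrative stand-ins):

import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keract import get_gradients_of_trainable_weights

# The model must be compiled with a loss so gradients can be computed.
model = Sequential([
    Dense(4, activation='relu', input_shape=(8,)),
    Dense(1),
])
model.compile(optimizer='sgd', loss='mse')

x = np.random.random((4, 8))
y = np.random.random((4, 1))

weight_grads = get_gradients_of_trainable_weights(model, x, y)
for weight_name, grad in weight_grads.items():
    print(weight_name, grad.shape)  # one entry per kernel/bias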
Get gradients of activations
Inputs are:
- model: a keras.models.Model object.
- x: input data (numpy array). Keras convention.
- y: labels (numpy array). Keras convention.
from keract import get_gradients_of_activations
get_gradients_of_activations(model, x, y)
The output is a dictionary mapping each layer to the values of its gradients (with respect to x and y).
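A sketch along the same lines, reusing the illustrative compiled model, x, and y from the previous example:

from keract import get_gradients_of_activations

# Gradient of the loss with respect to each layer's output.
activation_grads = get_gradients_of_activations(model, x, y)
for layer_name, grad in activation_grads.items():
    print(layer_name, grad.shape)  # same shapes as the activations themselves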
Examples
Examples are provided for:
- keras.models.Sequential - mnist.py
- keras.models.Model - multi_inputs.py
- Recurrent networks - recurrent.py
In the case of MNIST with LeNet, we are able to fetch the activations for a batch of size 128:
conv2d_1/Relu:0            (128, 26, 26, 32)
conv2d_2/Relu:0            (128, 24, 24, 64)
max_pooling2d_1/MaxPool:0  (128, 12, 12, 64)
dropout_1/cond/Merge:0     (128, 12, 12, 64)
flatten_1/Reshape:0        (128, 9216)
dense_1/Relu:0             (128, 128)
dropout_2/cond/Merge:0     (128, 128)
dense_2/Softmax:0          (128, 10)
We can even visualise some of them.
[Figure: A random seven from MNIST]
[Figure: Activation map of CONV1 of LeNet]
[Figure: Activation map of FC1 of LeNet]
[Figure: Activation map of the Softmax layer of LeNet. Yes, it's a seven!]
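One way such activation maps could be rendered with matplotlib (a sketch, assuming the activations dictionary from the MNIST example above):

import matplotlib.pyplot as plt

# First sample of the batch, all 32 feature maps of CONV1.
conv1 = activations['conv2d_1/Relu:0']  # shape (128, 26, 26, 32)
fig, axes = plt.subplots(4, 8, figsize=(12, 6))
for i, ax in enumerate(axes.flat):
    ax.imshow(conv1[0, :, :, i], cmap='gray')
    ax.axis('off')
plt.show()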