A tool to analyze and understand convolutional neural networks for time series
Project description
TimeLens
A tool to better understand your convolutional neural network for time series.
Example Usage
Activation Maximization
Basic setup for activation maximization:
- Import the relevant tools from the library
- Instantiate the AMTrainer class with your PyTorch CNN and the device you want to run it on (a minimal model sketch follows this list)
- Create an AM input using one of the provided methods (See: AM Inputs)
- Create a list of penalties (order matters!) (See: AM Penalties)
  - The configuration used here is the recommended one
- Perform AM for a whole layer or for a single unit by calling the activation maximization methods on the AMTrainer object
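The snippets below assume an existing PyTorch model (cnn) and a device. As a reference point only, here is a minimal, hypothetical model whose first convolution is reachable under the name 'conv_layers.conv0' used in the calls below; your own architecture and layer names will differ:

import torch
import torch.nn as nn
from collections import OrderedDict

# Hypothetical CNN for 6-channel time series of length 125.
# The named Sequential makes the first convolution addressable as 'conv_layers.conv0'.
class ExampleCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv_layers = nn.Sequential(OrderedDict([
            ('conv0', nn.Conv1d(6, 16, kernel_size=5, padding=2)),
            ('relu0', nn.ReLU()),
            ('conv1', nn.Conv1d(16, 32, kernel_size=5, padding=2)),
            ('relu1', nn.ReLU()),
        ]))
        self.head = nn.Linear(32 * 125, 2)

    def forward(self, x):
        x = self.conv_layers(x)
        return self.head(x.flatten(1))

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
cnn = ExampleCNN().to(device)

With cnn and device in place, activation maximization itself looks like this: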
from timelens.am.am_trainer import AMTrainer
from timelens.am.am_inputs import sin_ts, square_ts

am_trainer = AMTrainer(cnn, device)

# c: channels | t: time series length | vmax: maximum amplitude | freq: frequency of sine
input_ts = sin_ts(c=6, t=125, vmax=0.3, freq=0.5)

penalties = [
    {
        'penalty': 'eap',
        'weight': 0.1,
        'th': 0.6827,
    },
    {
        'penalty': 'l2',
        'weight': 0.1,
    },
    {
        'penalty': 'tv',
        'weight': 0.1,
    },
]

# AM for a whole layer ...
am_result = am_trainer.activation_maximization_layer(input_ts, 'conv_layers.conv0', penalties=penalties, th_clip=0.6827, iterations=5000)
# ... or for a single unit (here: unit 0 of that layer)
am_result = am_trainer.activation_maximization(input_ts, 'conv_layers.conv0', 0, penalties=penalties, th_clip=0.6827, iterations=5000)
Evaluation
The following code assumes that you have already run the code from the previous section.
- Set the comparison data for dynamic time warping (train_data is a torch.utils.data.TensorDataset here; a hypothetical example follows this list)
- Call the evaluation method with the desired evaluation metric (See: AMResult)
- This call can consume a lot of RAM/VRAM (depending on the device), so keep this in mind
  - The amount of RAM/VRAM grows with the size of your comparison data
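The evaluation needs comparison data to warp against. A minimal, hypothetical train_data built from random tensors is sketched below; in practice you would pass your actual training set, the shapes here simply match the AM input above:

import torch
from torch.utils.data import TensorDataset

# Hypothetical comparison data: 100 samples, 6 channels, 125 time steps,
# plus dummy labels. Replace with your real training data.
X = torch.randn(100, 6, 125)
y = torch.randint(0, 2, (100,))
train_data = TensorDataset(X, y)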
# the am_result variable from the previous example stores an object of the class AMResult
am_result.set_original_data(train_data.tensors[0])
mean_dtw_loss = am_result.evaluate('dtw')
Visualizations
This example shows how TimeLens can be used to create basic visualizations:
- Import CNNViz
- Instantiate a CNNViz object with the PyTorch CNN and the device
- Retrieve kernel or feature map visualizations for a whole layer or for a specific unit
from timelens.cnn.cnn_viz import CNNViz

cnn_viz = CNNViz(cnn, device)

layer_name = 'conv_layers.conv0'
idx_kernel = 0
input_ts = ...  # provide a training sample from your dataset here -> it will be fed through your network

kernels = cnn_viz.plot_layer_kernels(layer_name)
kernel = cnn_viz.plot_specific_kernel(layer_name, idx_kernel)
feature_maps = cnn_viz.plot_layer_feature_maps(layer_name, input_ts)
feature_map = cnn_viz.plot_specific_feature_maps(layer_name, idx_kernel, input_ts)
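If you need a concrete value for the input_ts placeholder above, one option is to take a sample from the train_data TensorDataset used in the evaluation example. Whether the visualization methods expect a single sample or a batched tensor is an assumption here, so adjust the shape to your model:

# Hypothetical: use the first training sample as the probe input.
# Add .unsqueeze(0) if your network expects an explicit batch dimension.
input_ts = train_data.tensors[0][0]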
Raw Network
This example shows how TimeLens can be used to access basic raw data about your network:
- Import CNNRaw
- Instantiate a CNNRaw object with the PyTorch CNN and the device
- Retrieve kernel data or feature maps for a whole layer or for a specific unit
from timelens.cnn.cnn_raw import CNNRaw

cnn_raw = CNNRaw(cnn, device)

layer_name = 'conv_layers.conv0'
idx_kernel = 0
input_ts = ...  # provide a training sample from your dataset here -> it will be fed through your network

kernels = cnn_raw.get_layer_kernels(layer_name)
kernel = cnn_raw.get_specific_kernel(layer_name, idx_kernel)
feature_maps = cnn_raw.get_layer_feature_maps(layer_name, input_ts)
feature_map = cnn_raw.get_specific_feature_maps(layer_name, idx_kernel, input_ts)
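Because the raw accessors hand back plain data, you can post-process it yourself. The sketch below assumes that get_specific_kernel returns a 2-D torch.Tensor of shape (in_channels, kernel_size), which is an assumption about the return type rather than documented behaviour, and plots the kernel weights with matplotlib:

import matplotlib.pyplot as plt

# Assumption: kernel is a torch.Tensor of shape (in_channels, kernel_size).
weights = kernel.detach().cpu().numpy()
for ch, w in enumerate(weights):
    plt.plot(w, label=f'channel {ch}')
plt.title(f'{layer_name} - kernel {idx_kernel}')
plt.xlabel('kernel position')
plt.ylabel('weight')
plt.legend()
plt.show()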
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
timelens-0.2.0a0.tar.gz (12.1 kB)
Built Distribution
timelens-0.2.0a0-py3-none-any.whl (14.8 kB)
File details
Details for the file timelens-0.2.0a0.tar.gz.
File metadata
- Download URL: timelens-0.2.0a0.tar.gz
- Upload date:
- Size: 12.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.5.1 CPython/3.11.5 Linux/6.1.53-1-MANJARO
File hashes
Algorithm | Hash digest
---|---
SHA256 | 63cf24da07629cd23cc16e478843837488966f372293b50e0eb74482072c42bf
MD5 | c10f20c9ed8246f898917681602b60e0
BLAKE2b-256 | 40eccdc70f54c9906d57932e341eadb67d5928722f476c0820c6f3fa2f2c90f0
File details
Details for the file timelens-0.2.0a0-py3-none-any.whl.
File metadata
- Download URL: timelens-0.2.0a0-py3-none-any.whl
- Upload date:
- Size: 14.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.5.1 CPython/3.11.5 Linux/6.1.53-1-MANJARO
File hashes
Algorithm | Hash digest
---|---
SHA256 | 3efeffcb49b29858e4fcbc8130df3b8213fffa4b49d8e95d3319a1da020a9de5
MD5 | 8a62e84d2af3478d86db160c6b904ec7
BLAKE2b-256 | ed04ccf65504ecf1a83a9bd7cbfc1cb4832d8ece6e919458be99e9b541208a64