
Frequency-domain model explanation (IG) package


freqIG

Overview

This repository contains the implementation of freqIG, a method based on the FLEX (Frequency Layer Explanation) principle [1] and designed to explain the predictions of deep neural networks (DNNs) on time-series classification tasks. freqIG combines Integrated Gradients (IG) with the real fast Fourier transform (RFFT) to provide frequency-based attribution scores.

More generally, the method helps reveal how different frequency components of a time-series input influence a DNN's predictions, improving model interpretability.

For details on the general concept, see [1]: "Using EEG Frequency Attributions to Explain the Classifications of a Deep Neural Network for Sleep Staging" (Paul Gräve et al.).


Features

  • RFFT Transformation: Input time-series data are transformed into the frequency domain using the RFFT.
  • iRFFT Transformation: The inverse RFFT (iRFFT) is implemented as the first layer in the DNN to process frequency-domain inputs.
  • Integrated Gradients Attribution: Captum's IG method is used to compute relevance scores for frequency bands, showing which frequency components contribute to the model's predictions (see the sketch below).
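
The sketch below illustrates this pipeline. It is a minimal, hypothetical example, not freqIG's actual internals (the class name and the stacked real/imaginary tensor layout are our assumptions): an iRFFT module prepended to a time-series model so that the combined network accepts frequency-domain inputs.

import torch

# Hypothetical sketch (not freqIG's internals): an iRFFT module that can be
# prepended to a time-series model so the combined network takes
# frequency-domain inputs. Real and imaginary parts are stacked into a
# real-valued tensor, since gradient-based attribution needs real inputs.
class IRFFTLayer(torch.nn.Module):
    def __init__(self, n: int):
        super().__init__()
        self.n = n  # length of the original time-domain signal

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # z has shape (..., 2, n // 2 + 1): stacked (real, imag) parts
        z_complex = torch.complex(z[..., 0, :], z[..., 1, :])
        return torch.fft.irfft(z_complex, n=self.n)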

Definition (FLEX principle)

Let $F$ be our model (DNN) and $x \in \mathbb{R}^n$ be our input (time-series data). Then, with $\bar{F} = F \circ \mathrm{iRFFT}$ and $\bar{x} = \mathrm{RFFT}(x)$, we get
$$\mathrm{FLEX}_i(F,x) = \mathrm{IG}_i(\bar{F},\bar{x}),$$
where $\mathrm{FLEX}(F,x) = (\mathrm{FLEX}_1(F,x), \ldots, \mathrm{FLEX}_m(F,x))$ and $m = \lfloor n/2 \rfloor + 1$ is the number of frequency bins in $\bar{x}$.
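
Written out with Captum, this definition amounts to the following minimal sketch (assuming the hypothetical IRFFTLayer from the Features section; freqIG.attribute, documented below, provides this functionality with additional options):

import torch
from captum.attr import IntegratedGradients

def flex_attribute(model: torch.nn.Module, x: torch.Tensor, target: int) -> torch.Tensor:
    n = x.shape[-1]
    wrapped = torch.nn.Sequential(IRFFTLayer(n), model)  # F_bar = F ∘ iRFFT
    z = torch.fft.rfft(x)                                # x_bar = RFFT(x)
    z_bar = torch.stack((z.real, z.imag), dim=-2)        # real-valued view of x_bar
    ig = IntegratedGradients(wrapped)
    # FLEX_i(F, x) = IG_i(F_bar, x_bar); Captum uses a zero baseline by default.
    # The result has the same (real, imag) layout as z_bar; contributions can
    # be summed over that axis to get one score per frequency bin.
    return ig.attribute(z_bar, target=target)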


Installation

Requirements

  • Python 3.8+
  • Required libraries:
    • numpy
    • torch
    • captum

Install Dependencies

You can install the required Python libraries using pip:

pip install numpy torch captum
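
The package itself is published on PyPI (as freqig), so it can be installed the same way:

pip install freqig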

Documentation

freqIG.attribute

Compute frequency-based attribution scores for a model predicting on time-series data.

freqIG.attribute(
    input: Union[np.ndarray, list, torch.Tensor],
    model: Any,
    target: Optional[int] = None,
    baseline: Optional[Union[np.ndarray, list, torch.Tensor]] = None,
    n_steps: int = 50,
    segment: Optional[Union[np.ndarray, list, torch.Tensor]] = None,
    start_idx: Optional[int] = None,
    additional_forward_args: Optional[Any] = None
) -> np.ndarray

Parameters

  • input : array-like or torch.Tensor
    The input time-series data.

  • model : callable
    The model to explain. It should act on the time-domain input; freqIG performs the frequency-domain wrapping internally (see Notes).

  • target : int, optional
    Index of the class to explain. If None, explains the model's predicted class.

  • baseline : array-like or torch.Tensor, optional
    Baseline input for Integrated Gradients. Defaults to a zero input of the same shape as input.

  • n_steps : int, default=50
    Number of steps in the IG path.

  • segment : array-like or torch.Tensor, optional
    Segment of the input for localized attribution.

  • start_idx : int, optional
    Start index of the segment within the original input.

  • additional_forward_args : Any, optional
    Additional arguments passed to the model during attribution.

Returns

  • np.ndarray
    Array containing the frequency attribution scores.

Raises

  • ValueError
    If segment is provided but start_idx is missing, or if the segment exceeds the bounds of the input.
  • ValueError
    If baseline is provided but its shape does not match the input.

Notes

This function applies Integrated Gradients in the frequency domain to provide frequency-wise attributions for any model acting on time-series data, following the FLEX [1] principle.
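
Because IG satisfies the completeness axiom, the frequency attributions sum, up to the numerical error of the path-integral approximation, to the difference between the model outputs (for the target class) at the input and at the baseline:
$$\sum_i \mathrm{FLEX}_i(F,x) = \bar{F}(\bar{x}) - \bar{F}(\bar{x}'),$$
where $\bar{x}'$ is the baseline (zero by default). This provides a simple sanity check for the scores returned by freqIG.attribute.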

References

[1] P. Gräve, T. Steinbrinker, F. Ehrlich, P. Hempel, P. Zaschke, D. Krefting, and N. Spicher. "Using EEG Frequency Attributions to Explain the Classifications of a Deep Neural Network for Sleep Staging." 2025.

Examples

import numpy as np
import torch
from freqIG import attribute

# Generate dummy time-series data: 20 samples, each of length 10
np.random.seed(0)
n_samples = 20
n_features = 10
X = np.linspace(0, 1, n_features)[None, :] + 0.1 * np.random.randn(n_samples, n_features)
y = np.random.randint(0, 2, size=(n_samples,))  # Binary classification

X_torch = torch.tensor(X, dtype=torch.float32)
y_torch = torch.tensor(y, dtype=torch.long)

# Simple model acting directly on the time-series input (2-class classifier)
class SimpleModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = torch.nn.Linear(n_features, 2)

    def forward(self, x):
        return self.fc(x)

model = SimpleModel()
criterion = torch.nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

# VERY simple training loop (just for demonstration!)
model.train()
for epoch in range(30):
    optimizer.zero_grad()
    outputs = model(X_torch)
    loss = criterion(outputs, y_torch)
    loss.backward()
    optimizer.step()
model.eval()

# Pick one sample for explanation
sample = X[0:1]

# Run freqIG.attribute to get attributions
attr_scores = attribute(
    input=sample,    # shape (1, 10)
    model=model,
    target=0,        # Explain class 0
    n_steps=30
)

print("Attribution scores:", attr_scores)
