
Novel Time-Frequency Analysis with Learnable Kernels (NTFA-LK): A minimal package with arbitrary kernels and learnable parameters

ntfa-lk

A minimal package providing NTFA-LK for time-frequency decomposition with arbitrary kernels. Installable with pip as ntfa-lk.

Quick install

# Basic install
python3 -m pip install ntfa-lk
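
# Verify the install (note the underscore in the import name)
python3 -c "from ntfa_lk import NTFALayer"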

⚠️ Important: Use the Polynomial Kernel

The polynomial kernel performs well in combination with any of the other kernels and should be included in your kernel configuration. It is not set as the default so that users retain the flexibility to configure the polynomial parameters (degree and offset) for their specific use case.

Recommended configuration:

kernel_names=['polynomial', 'gaussian']
gamma=[0.5, 0.5]
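
For example, a layer using this recommended hybrid kernel can be constructed directly (a minimal sketch; all other arguments keep their defaults):

from ntfa_lk import NTFALayer

# Equal-weight polynomial + Gaussian hybrid kernel (recommended configuration)
layer = NTFALayer(
    kernel_names=['polynomial', 'gaussian'],
    gamma=[0.5, 0.5],
)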

Examples

The examples/ folder contains complete working examples:

  • run_example.py - Basic NTFA-LK demonstration showing signal decomposition, time-frequency representation, and inverse transform with a noisy multi-frequency signal (3 Hz + 7 Hz + noise)

  • NTFAvsDWT.py - Machine learning comparison demonstrating NTFA-LK vs DWT (Discrete Wavelet Transform) as feature extractors for time series classification (a minimal sketch of this workflow follows the run commands below):

    • Extracts 2D time-frequency features using NTFA-LK
    • Extracts 2D wavelet coefficients using DWT
    • Trains separate CNNs on each feature type
    • Compares classification accuracy on ECG200 dataset
    • Visualizes feature representations and training curves

To run the examples:

cd examples/
python3 run_example.py        # Basic signal processing demo
python3 NTFAvsDWT.py          # ML classification comparison (requires aeon, pywt)
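
The sketch below outlines the feature-extraction idea behind NTFAvsDWT.py. It substitutes random placeholder data for ECG200 and a toy CNN for the example's networks, and assumes each call to the layer returns a 2D time-frequency map; treat it as an outline rather than the example script itself.

import torch
import torch.nn as nn
import torch.nn.functional as F
from ntfa_lk import NTFALayer

layer = NTFALayer(kernel_names=['polynomial', 'gaussian'], gamma=[0.5, 0.5])

# Placeholder time series standing in for ECG200 (length 96, 2 classes)
series = torch.randn(8, 96)
labels = torch.randint(0, 2, (8,))

# Extract a 2D time-frequency map per series and stack them as CNN input
features = torch.stack([layer(x) for x in series])    # (batch, H, W), assuming 2D maps
features = features.unsqueeze(1)                       # (batch, 1, H, W)

# Toy CNN classifier on the NTFA-LK features
cnn = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(8, 2),
)
loss = F.cross_entropy(cnn(features), labels)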

API Overview

Main Class: NTFALayer

NTFALayer(
    kernel_names=['polynomial', 'gaussian'],
    gamma=[0.5, 0.5],
    alpha=0.15,            # Alpha threshold
    beta=0.9,             # Beta threshold
    threshold_mode='hard', # 'hard' or 'soft' (differentiable)
    window_size=20,
    step_size=4,
    interp_factor=0.25,    # Cubic interpolation factor
    learn_alpha=False,     # Make alpha learnable
    learn_beta=False,      # Make beta learnable
    learn_sigmoid_temp=False,  # Make sigmoid temp learnable (soft mode only)
    sigmoid_temp=1.0       # Temperature for soft operations
)

All parameters can be made learnable via backpropagation; use threshold_mode='soft' for fully differentiable thresholding and the learn_* flags to make alpha, beta, and the sigmoid temperature trainable.
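
As a minimal training sketch (assuming NTFALayer behaves as a torch.nn.Module; the loss here is an arbitrary placeholder, not part of the package):

import torch
from ntfa_lk import NTFALayer

# Soft mode keeps thresholding differentiable; the learn_* flags expose
# alpha, beta, and the sigmoid temperature as trainable parameters
layer = NTFALayer(
    kernel_names=['polynomial', 'gaussian'],
    gamma=[0.5, 0.5],
    threshold_mode='soft',
    learn_alpha=True,
    learn_beta=True,
    learn_sigmoid_temp=True,
)

signal = torch.randn(512)                     # toy input signal
optimizer = torch.optim.Adam(layer.parameters(), lr=1e-2)

for _ in range(10):
    optimizer.zero_grad()
    tfr = layer(signal)
    loss = tfr.abs().mean()                   # placeholder loss; use a task loss in practice
    loss.backward()
    optimizer.step()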

Forward pass:

tfr = layer(signal)  # Returns time-frequency representation

Inverse transform:

recovered = layer.inverse_transform(tfr)  # Phase automatically stored
# Or provide phase explicitly:
recovered = layer.inverse_transform(tfr, tfr_phase)
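
Putting the two together, a round trip on a noisy two-tone signal (in the spirit of run_example.py; the sampling setup below is illustrative, not taken from the example) looks like:

import torch
from ntfa_lk import NTFALayer

# Illustrative 3 Hz + 7 Hz signal with additive noise, 1 s sampled at 200 Hz (assumed)
t = torch.linspace(0, 1, 200)
signal = torch.sin(2 * torch.pi * 3 * t) + torch.sin(2 * torch.pi * 7 * t)
signal = signal + 0.3 * torch.randn_like(signal)

layer = NTFALayer(kernel_names=['polynomial', 'gaussian'], gamma=[0.5, 0.5])
tfr = layer(signal)                       # time-frequency representation
recovered = layer.inverse_transform(tfr)  # phase stored during the forward pass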

Parameter Descriptions

alpha (default: 0.15)

Alpha threshold for final smoothing. Suppresses weak frequency components in the final time-frequency representation. Higher values result in more aggressive smoothing.

beta (default: 0.9)

Beta threshold for the smart minimum operation. Controls which frequency components participate in the smart minimum calculation. Higher values (closer to 1.0) make the filter more selective, only including the strongest frequency components.

gamma (default: equal weights)

Kernel mixing weights. Automatically normalized to sum to 1; for example, gamma=[2, 1, 1] becomes [0.5, 0.25, 0.25]. For a hybrid kernel with two components, gamma=[0.5, 0.5] gives equal weight to each kernel.

threshold_mode (default: 'hard')

Controls differentiability of thresholding operations:

  • 'hard': Binary masking (faster, non-differentiable)
  • 'soft': Smooth gradients via LogSumExp (fully differentiable, enables learning)

sigmoid_temp (default: 1.0)

Temperature parameter for soft operations (only used when threshold_mode='soft'). Higher values approach hard thresholding. Can be made learnable with learn_sigmoid_temp=True.
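
The effect of the temperature can be pictured with a generic sigmoid gate; this is an illustration of the general idea, not necessarily the package's exact soft-thresholding formula:

import torch

x = torch.linspace(0, 1, 5)
alpha = 0.5                                        # threshold being softened

for temp in (1.0, 10.0, 100.0):
    soft_mask = torch.sigmoid(temp * (x - alpha))  # generic soft gate
    print(temp, soft_mask)
# As temp grows, the soft mask approaches the hard step mask (x > alpha)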

Available Kernels

  • "polynomial" - Polynomial kernel: (x + offset)^degree
    • Default params: degree=2, offset=1.3
  • "gaussian" - Gaussian kernel: exp(-0.5 * ((x - center) / sigma)^2)
    • Default params: center=0.7, sigma=1.0
  • "matern32" - Matérn 3/2 kernel: (1 + √3(x + offset)) * exp(-√3(x + offset))
    • Default params: offset=1.7
  • "matern52" - Matérn 5/2 kernel: (1 + √5(x + offset) + (5/3)(x + offset)²) * exp(-√5(x + offset))
    • Default params: offset=1.7
  • "rational" - Rational quadratic kernel: (1 + scale * x)^(-power)
    • Default params: scale=1/3, power=3
  • "gamma_rational" - Gamma rational quadratic kernel: (1 + scale * (x + offset)²)^(-power)
    • Default params: scale=1/3, offset=1.7, power=3
  • LearnableChebyshevKernel - Learnable Chebyshev polynomial kernel with trainable coefficients
    • Example: LearnableChebyshevKernel(degree=4, init_coeffs=[0, 1, 0, 0])
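
Written out as plain functions, the first two formulas above (with their default parameters) look like this for reference; the built-in implementations may differ in detail:

import torch

def polynomial_kernel(x, degree=2, offset=1.3):
    # (x + offset)^degree
    return (x + offset) ** degree

def gaussian_kernel(x, center=0.7, sigma=1.0):
    # exp(-0.5 * ((x - center) / sigma)^2)
    return torch.exp(-0.5 * ((x - center) / sigma) ** 2)

x = torch.linspace(0, 1, 5)
print(polynomial_kernel(x))
print(gaussian_kernel(x))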

Custom kernels: You can also pass your own callable functions (see the Custom Kernels section below).


Key Features

  • Window-by-window processing: Kernels applied correctly within each window
  • Arbitrary kernels: Use 1, 2, 3, or more kernels
  • No scipy: Pure NumPy/PyTorch implementation
  • Flexible: Works for denoising, TFR, feature extraction

Custom Kernels

You can provide your own kernel functions:

import torch
from ntfa_lk import NTFALayer

# Define custom kernel
def my_custom_kernel(x, scale=2.0, power=3):
    """Custom kernel function."""
    return (x * scale) ** power

# Use with NTFALayer
layer = NTFALayer(
    kernel_names=[my_custom_kernel, 'gaussian'],  # Mix custom + builtin
    kernel_params=[
        {'scale': 1.5, 'power': 2},  # params for custom kernel
        {'center': 0.5, 'sigma': 0.8}  # params for gaussian
    ],
    gamma=[0.6, 0.4]
)

# Or use lambda functions
layer = NTFALayer(
    kernel_names=[
        lambda x, scale=1.0: torch.exp(-x * scale),
        'polynomial'
    ],
    kernel_params=[
        {'scale': 2.0},
        {'degree': 3, 'offset': 1.0}
    ]
)

Reference

If you use this package or the underlying DDKF technique in your research or software, please cite the original work:

@article{bensegueni2025dual,
  title={Dual Dynamic Kernel Filtering: Accurate Time-Frequency Representation, Reconstruction, and Denoising},
  author={Bensegueni, Skander and Belhaouari, Samir Brahim and Kahalan, Yunis Carreon},
  journal={Digital Signal Processing},
  pages={105407},
  year={2025},
  publisher={Elsevier}
}

License

This project is licensed under the MIT License.

Authors

  • Skander Bensegueni
  • Yunis Kahalan

v4.0.0 - Corrected algorithm, updated parameter names, backpropagatable interpolation

