An implementation of linear LVMs with a spectral regulariser.

Spectrally regularised LVMs

Current version: 0.1.1

Spectrally-regularised-LVMs is a Python package for estimating the parameters of linear latent variable models (LVMs) with a unique spectral regularisation term, aimed at single-channel time-series applications.

Purpose

LVMs are a family of statistical models that capture the underlying structure in observed data. This package caters to single-channel time-series applications and provides a methodology for estimating the LVM parameters. The estimated parameters are encouraged to be diverse via a spectral regularisation term that penalises duplication of the spectral information captured by the latent sources.
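To make the idea of penalising spectral duplication concrete, the sketch below compares the power spectra of two candidate source directions: a large overlap between the normalised spectra means the two sources capture the same spectral content. This is an illustrative toy, not the package's internal regulariser; `spectral_overlap` and all names here are hypothetical.

```python
import numpy as np

def spectral_overlap(w_i, w_j):
    """Inner product of the normalised power spectra of two source filters.

    A hypothetical stand-in for a spectral-duplication penalty: identical
    spectral content gives a large overlap, distinct content a small one.
    """
    P_i = np.abs(np.fft.rfft(w_i)) ** 2
    P_j = np.abs(np.fft.rfft(w_j)) ** 2
    P_i = P_i / np.sum(P_i)  # normalise each spectrum to sum to one
    P_j = P_j / np.sum(P_j)
    return float(np.dot(P_i, P_j))

rng = np.random.default_rng(0)
w1 = rng.standard_normal(512)
w2 = w1.copy()                  # duplicated spectral content
w3 = rng.standard_normal(512)   # unrelated spectral content

# A duplicated source overlaps far more with w1 than an unrelated one,
# so penalising this overlap encourages diverse sources.
print(spectral_overlap(w1, w2), spectral_overlap(w1, w3))
```

A regulariser built from such pairwise overlaps would be added to the LVM objective, discouraging any two latent sources from recovering the same spectral band.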

The purpose of this package is to provide a complete methodology that caters to a variety of LVM objective functions.

Documentation

Please visit the docs for all supporting documentation for this package.

Installation

The package is designed to be used through the Python API, and can be installed using pip:

$ pip install spectrally-regularised-LVMs

Requirements

This package requires Python 3.10 or later. For the other Python dependencies, please check the pyproject.toml file included in this repository. The dependencies of this package are as follows:

Package Version
Python ≥ 3.10
Numpy ≥ 1.23.1
Matplotlib ≥ 3.5.2
SciPy ≥ 1.8.1
scikit-learn ≥ 1.1.2
tqdm ≥ 4.64.1
SymPy ≥ 1.1.1

API usage

Model parameter estimation

A generic example is shown below:

import spectrally_regularised_LVMs as srLVMs

# Load in some data
signal_data = ... # Load a single channel time-series signal
Fs = ... # Sampling frequency of the data

# Hankelise the data
X = srLVMs.hankel_matrix(signal_data,
                          Lw = 512,
                          Lsft = 1)

# Define a cost function for latent sources with maximum variance
cost_inst = srLVMs.VarianceCost()

# Define a model instance
model_inst = srLVMs.LinearModel(n_sources=10,
                                 cost_instance=cost_inst,
                                 whiten=False,
                                 alpha_reg=1.0)

# Estimate the model parameters
model_inst.fit(X, Fs=Fs)
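The Hankelisation step above turns the single-channel signal into a matrix of sliding windows. Assuming `srLVMs.hankel_matrix` stacks windows of length `Lw`, shifted by `Lsft` samples, as the rows of `X` (an assumption based on the call signature, not the package source), the idea can be sketched in plain NumPy:

```python
import numpy as np

def hankelise(signal, Lw, Lsft=1):
    """Illustrative re-implementation of sliding-window Hankelisation.

    Each row is a window of length Lw; consecutive rows are shifted by
    Lsft samples. Mirrors the hankel_matrix call above in spirit only.
    """
    n_windows = (len(signal) - Lw) // Lsft + 1
    # Index matrix: row k selects samples [k*Lsft, k*Lsft + Lw)
    idx = np.arange(Lw)[None, :] + Lsft * np.arange(n_windows)[:, None]
    return signal[idx]

x = np.arange(10.0)
X = hankelise(x, Lw=4, Lsft=2)
print(X.shape)  # (4, 4): four overlapping windows of length four
```

Each column of the resulting matrix then acts as one observed feature for the LVM, which is what lets a single-channel signal be treated as multivariate data.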

Cost function implementation

This package allows users to implement their own objective functions. Two examples are shown here.

Method one - user defined

This method lets users implement their objective function and all required higher-order derivatives manually. This is demonstrated below:

import numpy as np
import spectrally_regularised_LVMs as srLVMs

# Define the objective function (maximise source variance)
def cost(X, w, y):
    return -1 * np.mean((X @ w) ** 2, axis=0)  # the framework performs minimisation

# Define the gradient vector
def grad(X, w, y):
    return -2 * np.mean(y * X, axis=0, keepdims=True).T

# Define the Hessian matrix
def hess(X, w, y):
    return -2 * np.cov(X, rowvar=False)

# Initialise the cost instance
user_cost = srLVMs.UserCost(use_hessian=True)

# Define the objective function, gradient and Hessian
user_cost.set_cost(cost)
user_cost.set_gradient(grad)
user_cost.set_hessian(hess)

# Check the implementation
X_ = np.random.randn(1000, 16)
w_ = np.random.randn(16, 1)
y_ = X_ @ w_

res_grad = user_cost.check_gradient(X_, w_, y_, step_size=1e-4)
res_hess = user_cost.check_hessian(X_, w_, y_, step_size=1e-4)
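A gradient check of this kind presumably compares the analytical gradient against finite differences of the cost. The sketch below shows that idea for a scalar-valued variant of the variance cost above; it is an illustration of the technique, not the package's `check_gradient` implementation, and `fd_gradient` is a hypothetical helper.

```python
import numpy as np

def cost(X, w):
    # Scalar variant of the variance cost above (negated for minimisation)
    return -np.mean((X @ w) ** 2)

def fd_gradient(cost_fn, X, w, step=1e-4):
    """Central finite-difference approximation of d cost / d w."""
    g = np.zeros_like(w)
    for k in range(w.size):
        e = np.zeros_like(w)
        e[k, 0] = step
        g[k, 0] = (cost_fn(X, w + e) - cost_fn(X, w - e)) / (2.0 * step)
    return g

rng = np.random.default_rng(0)
X_ = rng.standard_normal((1000, 16))
w_ = rng.standard_normal((16, 1))

# Analytical gradient of -mean((Xw)^2) with respect to w
analytic = -2 * np.mean((X_ @ w_) * X_, axis=0, keepdims=True).T

err = np.max(np.abs(fd_gradient(cost, X_, w_) - analytic))
print(err)  # should be tiny if the analytical gradient is correct
```

If the maximum discrepancy is on the order of the finite-difference step squared or smaller, the hand-derived gradient can be trusted.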

Method two - SymPy defined

Users can also use SymPy to implement their objective function, which allows all higher-order derivatives to be obtained symbolically. An example is given below:

import sympy as sp
import numpy as np
import spectrally_regularised_LVMs as srLVMs

n_samples = 1000 # Fix the number of samples in the data
n_features = 16 # Fix the number of features

# Initialise the cost function instance
user_cost = srLVMs.SympyCost(n_samples, n_features, use_hessian=True)

# Get the SymPy representations of the model parameters
X_sp, w_sp, iter_params = user_cost.get_model_parameters()
i, j = iter_params

# Calculate the objective function (maximise source variance)
loss_i = sp.Sum(w_sp[j, 0] * X_sp[i, j], (j, 0, n_features - 1))
loss = -1 / n_samples * sp.Sum(loss_i**2, (i, 0, n_samples - 1))

# Set the properties within the instance
user_cost.set_cost(loss)

# Use SymPy to calculate the first and second order derivatives
user_cost.implement_methods()

# Check the implementation
X_ = np.random.randn(n_samples, n_features)
w_ = np.random.randn(n_features, 1)
y_ = X_ @ w_

res_grad = user_cost.check_gradient(X_, w_, y_, step_size=1e-4)
res_hess = user_cost.check_hessian(X_, w_, y_, step_size=1e-4)
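The symbolic differentiation that `implement_methods` presumably automates can be seen on a toy scale: given a symbolic cost, SymPy's `diff` produces the gradient entries directly. This is a minimal standalone illustration with a two-feature cost, not the package's internal machinery.

```python
import sympy as sp

# Toy two-feature version of the variance cost: y = w0*x0 + w1*x1,
# and the (negated) squared source as the loss to minimise.
w0, w1, x0, x1 = sp.symbols("w0 w1 x0 x1")
y = w0 * x0 + w1 * x1
loss = -y**2

# Differentiate the symbolic loss with respect to each weight
grad = [sp.diff(loss, w) for w in (w0, w1)]
print(grad)
```

For larger problems, the resulting symbolic derivatives would typically be compiled to fast numerical functions with `sp.lambdify`, which is the advantage of this route over hand-coding gradients and Hessians.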

Contributing

This package uses Poetry for dependency management and Python packaging and git for version control. To get started, first install git and Poetry and then clone this repository via

$ git clone git@github.com:RyanBalshaw/spectrally-regularised-LVMs.git
$ cd spectrally-regularised-LVMs

Then, install the necessary dependencies in a local environment via

$ poetry install --with dev,docs
$ poetry shell

This will install all necessary package dependencies and activate the virtual environment. You can then set up the pre-commit hooks via

$ pre-commit install
pre-commit installed at .git/hooks/pre-commit

License

This project is licensed under the MIT License - see the LICENSE file for details.
