
Python fork of pyda (Hewitson et al.) — LTPDA-style signal processing with repository integration.

Project description

ltpda

Python package for LTPDA-style signal processing and LTI system analysis. Fork of pyda-group/pyda, extended for integration with the LTPDA repository stack.


Overview

ltpda provides Python equivalents of the core LTPDA MATLAB toolbox objects: time-series and frequency-series data classes, spectral estimation, pole/zero models, digital filters, and a physical unit algebra. The "it just works" principle of the original MATLAB toolbox is preserved — common analysis tasks require very few lines of code, while the underlying data structures remain fully accessible for advanced use.

The package is in active development. Core signal processing is stable. Features not yet implemented include IIR filter design, plist parameter-list objects, XYZData, and full MATLAB-parity history code reconstruction (hist2py).


Requirements

  • Python 3.10 or later (tested up to 3.13)
  • numpy ≥ 1.18, scipy ≥ 1.5, matplotlib ≥ 3.0, h5py ≥ 3
  • lpsd ≥ 1.0.2 (log-scale PSD estimator — see Installation)

Installation

pip from PyPI

pip install ltpda

Full package listing: https://pypi.org/project/ltpda/

pip from wheel

Download the .whl file from the Releases page, then:

pip install ltpda-<version>-py3-none-any.whl

pip from source

No-clone option (installs directly from the git repository):

pip install git+https://github.com/LordSkippy/LTPDA.git#subdirectory=python

Or clone first:

git clone <this-repo>
cd LTPDA/python
pip install .

All dependencies, including lpsd, are installed automatically.

Developers — Poetry

cd LTPDA/python
poetry install
poetry run pre-commit install   # enable Black, isort, mypy, pylint hooks

lpsd (Apple Silicon only)

lpsd (source: git.physnet.uni-hamburg.de) installs automatically as a listed dependency. The only reason to touch it manually is a performance issue on Apple Silicon (M1/M2): lpsd contains C code that uses long double arithmetic, which on ARM is the same width as double (64-bit). The polyreg step has been observed to dominate runtime. If logpsd is unusably slow, compile lpsd from source with architecture-specific flags:

# from the lpsd source directory
gcc -arch arm64 -c -fPIC ltpda_dft.c
gcc -arch arm64 -shared -o ltpda_dft.so ltpda_dft.o

If you are contributing performance fixes for Apple Silicon, check for long double usage throughout the C sources.


Repository connectivity

ltpda can connect directly to an LTPDA repository database using the same credential and schema conventions as the MATLAB toolbox — plain MySQL via PyMySQL, no REST API involved. If the MySQL server is on a remote host, establish an SSH tunnel externally and point ltpda at the local forwarded port (see SSH tunnelling below).

Quick connect

from ltpda.repo import LTPDARepository

# Explicit credentials
repo = LTPDARepository('db.host.com', 'my_repo', 'alice', 'mysql_secret')

# Non-standard port (or local SSH tunnel endpoint)
repo = LTPDARepository('localhost', 'my_repo', 'alice', 'mysql_secret', port=3307)

# From environment variables
repo = LTPDARepository.from_env()

# Context manager — connection is always closed on exit
with LTPDARepository('db.host.com', 'my_repo', 'alice', 'mysql_secret') as repo:
    ts = repo.retrieve(42)

Credentials

Two separate credentials are required:

  • username — the MySQL username; same as the LTPDA web UI login name
  • password — the MySQL password; the mysql_password field in the web UI user settings, not the web login password

These are the same credentials the MATLAB toolbox uses to connect via JDBC.

Environment variables

LTPDARepository.from_env() reads connection parameters from five environment variables:

  • LTPDA_HOST — MySQL hostname; use localhost when an SSH tunnel is running (required)
  • LTPDA_PORT — MySQL port; use the local tunnel port when tunnelling (optional, default 3306)
  • LTPDA_DB — database (schema) name (required)
  • LTPDA_USER — MySQL username, i.e. the web UI login name (required)
  • LTPDA_PASS — MySQL password, i.e. mysql_password from the web UI (required)

If any required variable is missing, from_env() raises EnvironmentError listing every absent variable.
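
A minimal sketch of handling the missing-variable case (it assumes only what is described above, namely that from_env() raises the built-in EnvironmentError):

from ltpda.repo import LTPDARepository

try:
    repo = LTPDARepository.from_env()
except EnvironmentError as exc:
    # the exception message lists every absent LTPDA_* variable
    print(f'Repository connection not configured: {exc}')
    raise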

Setting env vars — Option A: shell rc file (Linux / macOS / Git Bash)

Add to ~/.bashrc, ~/.zshrc, or ~/.bash_profile and then reload:

export LTPDA_HOST=db.host.com
export LTPDA_DB=my_repo
export LTPDA_USER=alice
export LTPDA_PASS=mysql_secret   # the mysql_password from the web UI
# optional:
export LTPDA_PORT=3306

source ~/.bashrc   # or open a new terminal

Setting env vars — Option B: .env file + python-dotenv (recommended for notebooks)

Create a .env file next to your notebook or script (add it to .gitignore; never commit credentials):

# .env — do NOT commit this file
LTPDA_HOST=db.host.com
LTPDA_DB=my_repo
LTPDA_USER=alice
LTPDA_PASS=mysql_secret

Then load it before calling from_env():

from dotenv import load_dotenv   # pip install python-dotenv
load_dotenv()

from ltpda.repo import LTPDARepository
repo = LTPDARepository.from_env()

Setting env vars — Option C: notebook cell (quick interactive use)

import os
os.environ['LTPDA_HOST'] = 'db.host.com'
os.environ['LTPDA_DB']   = 'my_repo'
os.environ['LTPDA_USER'] = 'alice'
os.environ['LTPDA_PASS'] = 'mysql_secret'

from ltpda.repo import LTPDARepository
repo = LTPDARepository.from_env()

Credentials in saved notebooks can be exposed unintentionally — passing them directly to the constructor is often safer:

repo = LTPDARepository('db.host.com', 'my_repo', 'alice', 'mysql_secret')

Setting env vars — Option D: Windows PowerShell (permanent user variable)

[System.Environment]::SetEnvironmentVariable('LTPDA_HOST', 'db.host.com', 'User')
[System.Environment]::SetEnvironmentVariable('LTPDA_DB',   'my_repo',     'User')
[System.Environment]::SetEnvironmentVariable('LTPDA_USER', 'alice',       'User')
[System.Environment]::SetEnvironmentVariable('LTPDA_PASS', 'mysql_secret','User')

Restart any terminals or IDE sessions after setting them.

Setting env vars — Option E: CI/CD (GitHub Actions / GitLab CI)

Store secrets in the platform secret store, not in the repository:

# GitHub Actions example
env:
  LTPDA_HOST: ${{ secrets.LTPDA_HOST }}
  LTPDA_DB:   ${{ secrets.LTPDA_DB }}
  LTPDA_USER: ${{ secrets.LTPDA_USER }}
  LTPDA_PASS: ${{ secrets.LTPDA_PASS }}

SSH tunnelling

ltpda does not manage SSH tunnels. If MySQL is behind a firewall, create the tunnel externally before connecting:

# Forward local port 3307 → MySQL on db.internal:3306 via gateway.host
ssh -L 3307:db.internal:3306 gateway.host -N &

Then pass hostname='localhost', port=3307 to ltpda:

repo = LTPDARepository('localhost', 'my_repo', 'alice', 'mysql_secret', port=3307)
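
If you prefer to keep the tunnel inside Python rather than running ssh externally, the third-party sshtunnel package can provide the same forwarding. A sketch, assuming key-based SSH access to gateway.host (sshtunnel is not an ltpda dependency):

from sshtunnel import SSHTunnelForwarder   # pip install sshtunnel
from ltpda.repo import LTPDARepository

tunnel = SSHTunnelForwarder(
    'gateway.host',
    remote_bind_address=('db.internal', 3306),
    local_bind_address=('127.0.0.1', 3307),
)
tunnel.start()   # forwards localhost:3307 -> db.internal:3306

repo = LTPDARepository('localhost', 'my_repo', 'alice', 'mysql_secret', port=3307)
# ... use repo ...
tunnel.stop()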

Submit

from datetime import datetime
from ltpda.tsdata import TSData
from ltpda.repo import LTPDARepository

ts = TSData.randn(nsecs=3600, fs=10, name='ACC_X', yunits='m/s^2')
ts.t0 = datetime(2024, 1, 15, 0, 0)   # absolute UTC start time

with LTPDARepository('db.host.com', 'my_repo', 'alice', 'secret') as repo:
    result = repo.submit(
        ts,
        experiment_title='Noise floor run',
        experiment_desc='Accelerometer noise at rest on optical bench',
        analysis_desc='No processing applied — raw data submission',
        quantity='acceleration',
        keywords='noise, accelerometer',
    )
    print(result.id, result.uuid)   # assigned DB id and UUID

Submit multiple objects in one transaction — a collection is created automatically:

results = repo.submit(
    ts1, ts2, ts3,
    experiment_title='Three-axis measurement',
    experiment_desc='Simultaneous X Y Z accelerometer data',
    analysis_desc='Raw data, no filtering',
)
print(results[0].cid)   # collection ID shared by all three

Mandatory field minimum lengths (mirrors the MATLAB web UI validation):

  • experiment_title — at least 5 characters
  • experiment_desc — at least 10 characters
  • analysis_desc — at least 10 characters

Retrieve by ID

# Single object
ts = repo.retrieve(42)

# Multiple objects
ts1, ts2 = repo.retrieve(42, 43)

# All objects in a collection
objects = repo.retrieve(cid=7)

Retrieve path selection (cross-compatibility)

ltpda auto-detects the submission origin from the objmeta.version field and chooses the optimal deserialisation path automatically:

  • Python (ltpda) objects — HDF5 binary in bobjs.mat, falling back to XML if the binary is missing
  • MATLAB objects — XML in objs.xml directly (MATLAB .mat binary is not parseable by Python)

No user action is required — repo.retrieve(id) always returns the correct ltpda object regardless of which tool submitted it.

Time-range retrieval

Retrieve all time-series segments overlapping a window, then concatenate and crop:

from datetime import datetime

ts = repo.get(
    'ACC_X',                           # SQL LIKE pattern on object name
    t0=datetime(2024, 1, 15, 0, 0),
    t1=datetime(2024, 1, 15, 6, 0),
    author='alice',                    # optional author filter
)
# ts is a single TSData spanning exactly [t0, t1]

This is the primary method for retrieving long continuous time-series stored as multiple shorter segments — the same pattern the MATLAB toolbox uses internally.

Search

# All objects
results = repo.find()

# By name pattern
results = repo.find(name='ACC%')

# By author and time range
results = repo.find(
    name='ACC_X',
    author='alice',
    date_from='2024-01-01',
    date_to='2024-02-01',
)

# Objects whose stored timespan overlaps [t0, t1]
from datetime import datetime
results = repo.find(
    name='ACC_X',
    timespan=(datetime(2024, 1, 15), datetime(2024, 1, 16)),
)

for r in results:
    print(r.id, r.name, r.submitted, r.t_start, r.t_stop)

Utility methods

# Full metadata (no binary download)
metas = repo.get_metadata(42, 43, 44)
print(metas[0].fs, metas[0].nsecs, metas[0].t0)

# Most recent segment for a channel
latest = repo.get_latest('ACC_X')

# ID ↔ UUID
uuid  = repo.get_uuid(42)
obj_id = repo.get_id(uuid)

# Collection membership
ids = repo.get_collection_ids(7)

# Duplicate detection
dupes = repo.find_duplicates()   # list of (id, uuid) pairs

# CSV report
repo.report('repo_dump.csv', date_from='2024-01-01')

# All accessible schemas
databases = repo.list_databases()

# Without opening a persistent connection
databases = LTPDARepository.available_databases('db.host.com', 'alice', 'secret')

Format compatibility

ltpda-submitted objects store their binary payload as HDF5 (.ltpda format) in the bobjs table. The objs.xml field is set to the sentinel value binary_hdf5.

  • MATLAB: can see ltpda-submitted objects in the web UI (all metadata fields are populated). MATLAB cannot reconstruct the data itself (binary_hdf5 is not a valid LTPDA XML serialization).
  • ltpda: can fully retrieve both ltpda-submitted HDF5 objects and MATLAB-submitted binary .mat objects (best-effort via scipy.io.loadmat; complex LTPDA class hierarchies may not parse correctly). XML-only MATLAB objects (binary=False) raise NotImplementedError.

Quick start

from ltpda.tsdata import TSData
from ltpda.dsp.spectral import psd, asd

# 10000 s of white noise at 10 Hz
ts = TSData.randn(nsecs=10000, fs=10, name='noise', yunits='m')

# Power and amplitude spectral density
Pxx = psd(ts, navs=10, window='BH92')
Sxx = asd(ts, navs=10, window='BH92')
Sxx.loglog()

Documentation

Creating data objects

TSData — time-series with a sampling rate. The time axis is auto-generated from fs.

from ltpda.tsdata import TSData

# White noise
ts = TSData.randn(nsecs=1000, fs=100, name='noise', yunits='V')

# Sine wave
s = TSData.sinewave(fs=100, nsecs=10, A0=2.0, f0=1.2, phi=0, name='sine', yunits='V')

# Zeros
z = TSData.zeros(nsecs=100, fs=10, yunits='m')

XYData / FSData — general 2-D data and frequency-series.

from ltpda.xydata import XYData
from ltpda.fsdata import FSData
import numpy as np

xy = XYData(xaxis=np.linspace(0, 1, 100), yaxis=np.random.randn(100),
            xname='Time', yname='Signal', xunits='s', yunits='V')

# FSData from a parametric function (e.g. noise model)
fs_noise = FSData.from_function(
    lambda f: 13.5e-12**2 * (1 + (2e-3 / f)**4),
    xmin=1e-4, xmax=1, npts=1000, xunits='Hz', yunits='m^2/Hz',
)

YData — scalar data with units (no x axis).

from ltpda.ydata import YData

y = YData(yaxis=3.14, yunits='m')

Arithmetic and error propagation

All arithmetic operations are supported between data objects and between objects and scalars / numpy arrays. Gaussian errors in ddata are propagated automatically.

ts  = TSData.randn(nsecs=100, fs=10, yunits='V')
ts2 = TSData.randn(nsecs=100, fs=10, yunits='V')

result = ts + ts2          # addition
ratio  = ts / ts2          # division — units cancel, errors propagate
power  = ts ** 3           # power — errors propagated via chain rule
scaled = ts * 10           # scalar multiply

# Attach per-sample errors and plot with error bars
ts.dyaxis = 0.1            # uniform error (shorthand for ts.yaxis.ddata = 0.1)
ts.plot(ShowErrors=True, ErrorType='area')

Units are tracked through every operation:

from ltpda.utils.unit import Unit

u = Unit('m^2 Hz^-1')
print(u.char())        # [m^(2)][Hz^(-1)]
print(u.sqrt().char()) # [m][Hz^(-1/2)]

Spectral analysis

All estimators accept a TSData input and return FSData.

from ltpda.dsp.spectral import psd, asd, csd, mscohere, cohere, tfe, logpsd

# Welch PSD / ASD
Pxx = psd(ts, navs=10, window='BH92')          # Power spectral density
Sxx = asd(ts, navs=10, window='BH92')          # Amplitude spectral density (= √PSD)

# Scale options: 'PSD' (default), 'ASD', 'PS', 'AS'
Sxx2 = psd(ts, navs=10, window='BH92', scale='ASD')

# Cross-spectral quantities (two inputs)
Pxy  = csd(ts1, ts2, navs=10, window='BH92')   # Cross-spectral density
coh  = mscohere(ts1, ts2, navs=10)             # Magnitude-squared coherence
ccoh = cohere(ts1, ts2, navs=10)               # Complex coherence
H    = tfe(ts1, ts2, navs=10, window='BH92')   # Transfer function estimate

# Log-scale PSD (requires lpsd package)
lPxx = logpsd(ts)
lPxx2 = logpsd(ts, order=3)                    # higher-order debiasing

# Plot with errors
lPxx.sqrt().loglog(lPxx2.sqrt(), ShowErrors=True, ErrorType='area')

Nfft vs navs: Pass Nfft to set the segment length in samples, or navs to set the target number of averages. Both control the frequency resolution / variance trade-off.
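
For example, a sketch using only the parameter names documented above:

# Fix the segment length: frequency resolution fs/Nfft, number of averages follows from the data length
Pxx_fine = psd(ts, Nfft=8192, window='BH92')

# Fix the number of averages: lower variance, coarser frequency resolution
Pxx_smooth = psd(ts, navs=50, window='BH92')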


Spectral windows

from ltpda.utils.specwin import Specwin

# List all available windows
Specwin.supportedWindows()
# ['Rectangular', 'Welch', 'Bartlett', 'Hanning', 'Hamming',
#  'Nuttall3', 'Nuttall4', 'Nuttall3a', 'Nuttall3b', 'Nuttall4a',
#  'Nuttall4b', 'Nuttall4c', 'BH92', 'SFT3F', 'SFT3M', 'FTNI',
#  'SFT4F', 'SFT5F', 'SFT4M', 'FTHP', 'HFT70', 'FTSRS', 'SFT5M',
#  'HFT90D', 'HFT95', 'HFT116D', 'HFT144D', 'HFT169D', 'HFT196D',
#  'HFT223D', 'HFT248D', 'Kaiser']

# Inspect a window
w = Specwin('BH92', N=1024)
print(w.nenbw)   # Normalised equivalent noise bandwidth
print(w.psll)    # Peak sidelobe level (dB)

# Kaiser window: specify sidelobe level
w_k = Specwin('Kaiser', N=1024, psll=200)

Pass the window name as a string to any spectral estimator: window='BH92'.


Pole/zero models

from ltpda.pzmodel import PZModel, PZ
import numpy as np

# PZ objects: real pole (f only) or complex pair (f + Q)
p1 = PZ(0.01, Q=2)   # complex pair at 0.01 Hz, Q=2
p2 = PZ(3)           # real pole at 3 Hz
z1 = PZ(0.1)         # real zero at 0.1 Hz
z2 = PZ(0.2)         # real zero at 0.2 Hz

pzm = PZModel(poles=[p1, p2], zeros=[z1, z2], gain=2, delay=0,
              iunits='m', ounits='V')
print(pzm)

# Evaluate frequency response → returns FSData
freqs = np.logspace(-3, 1, 500)
r = pzm.resp(freqs=freqs)
r.abs().loglog()

Noise generation

Generate a time-series with a spectral shape defined by a PZModel using the Franklin algorithm. The result can be arbitrarily long; state is maintained across calls.

from ltpda.dsp.noisegen import NoiseGen
from ltpda.dsp.spectral import logpsd

ng = NoiseGen(pzm=pzm, fs=30)
ts = ng.generateNoise(nsecs=1e5)

# Verify: compare generated spectrum against model response
S  = logpsd(ts)
r  = pzm.resp(freqs=S.xaxis.data)
S.sqrt().loglog(r.abs())
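
Because the generator keeps its internal filter state, consecutive calls continue the same noise stream. A sketch (the .yaxis.data access follows the attribute pattern used elsewhere in this README and is an assumption here):

import numpy as np

seg1 = ng.generateNoise(nsecs=1000)
seg2 = ng.generateNoise(nsecs=1000)

# the two segments join without a discontinuity at the boundary
combined = np.concatenate([seg1.yaxis.data, seg2.yaxis.data])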

FIR digital filters

from ltpda.dsp.filter import FIR

# Design filters (scipy windowed-sinc method)
lp = FIR.lowpass( fc=1,          gain=1, fs=10, order=32,   win='blackmanharris',
                  iunits='V', ounits='m')
hp = FIR.highpass(fc=1,          gain=1, fs=10, order=32,   win='blackmanharris',
                  iunits='V', ounits='m')
bp = FIR.bandpass(fc=[0.01, 0.1], gain=2, fs=10, order=1024, iunits='V', ounits='m')
bs = FIR.bandstop(fc=[0.01, 0.1], gain=2, fs=10, order=1024, iunits='V', ounits='m')

# Frequency response → FSData
r = lp.resp(f1=0.1, f2=5, nf=1000)
r.loglog()

# Apply to a time-series
ts_filtered = lp.filter(ts)

Differentiation

Six numerical differentiation methods are available via TSData.diff().

s = TSData.sinewave(fs=100, nsecs=10, A0=1, f0=1.2, phi=0, yunits='V')

# method: 'diff', '2point', '3point', '5point', 'order2', 'order2Smooth'
# order:  'Zero' (smoothed input), 'First' (first derivative), 'Second'
ds1 = s.diff(method='3point', order='First')
ds2 = s.diff(method='5point', order='Second')

# order2 and order2Smooth do not take an 'order' argument
ds_o2 = s.diff(method='order2Smooth')

s.plot(ds1, ds2)

  • diff — simple numpy finite difference
  • 2point — two-point stencil
  • 3point — three-point centered stencil
  • 5point — five-point centered stencil, higher accuracy
  • order2 — polynomial fitting on irregular grids
  • order2Smooth — order2 with a 5-point smoothing pass

Splitting data

# Time-series: split by time window [start, stop] in seconds
segment = ts.split_by_time(times=[10, 60])

# Frequency-series: split by frequency range [f_low, f_high] in Hz
band = Sxx.split_by_frequency(freqs=[0.1, 1.0])

Known bug (#5): split_by_time currently uses time values as sample indices rather than comparing against the actual time axis. Use with care on data that does not start at t = 0.
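
One possible workaround until #5 is fixed, assuming the index-based behaviour described above and a uniformly sampled time axis, is to pass times as offsets from the first sample (sketch only):

start_time, stop_time = 110.0, 160.0   # absolute times on the series' time axis
t_first = ts.xaxis.data[0]             # first time-axis value (may be nonzero)
segment = ts.split_by_time(times=[start_time - t_first, stop_time - t_first])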


File I/O

Objects are serialised to HDF5 with a versioned format. The file extension is .ltpda.

# Save
ts.save('my_timeseries.ltpda')

# Load
from ltpda.tsdata import TSData
ts2 = TSData.load('my_timeseries.ltpda')

# Load from text file
ts3 = TSData.from_txt_file('data.txt', fs=100, yunits='V',
                            xcol=0, ycol=1, delimiter=',')

Core classes

Data hierarchy

YData                   Y-axis data with units and Gaussian error propagation
  └── XYData            adds an X axis (general 2-D data)
        ├── TSData      time-series — sampling-rate aware; auto-generates time axis
        └── FSData      frequency-series — X units default to Hz

Supporting classes

  • Axis — wraps a numpy array with a Unit, an error array (ddata), and a name
  • Unit — symbolic unit algebra: parse, multiply, simplify, convert to SI
  • Specwin — 30+ spectral window functions
  • PZ — single pole or zero in f/Q or complex (s-plane) representation
  • PZModel — poles, zeros, gain, and delay; evaluates to FSData via .resp()
  • DFilter / FIR — digital filter classes with .resp() and .filter()
  • NoiseGen — Franklin-algorithm colored-noise generator driven by a PZModel
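
For orientation, a short sketch of how the axis layer is reached from a data object (attribute names follow the patterns used elsewhere in this README; any not shown above should be treated as assumptions):

from ltpda.tsdata import TSData

ts = TSData.randn(nsecs=10, fs=10, name='demo', yunits='V')

print(ts.xaxis.data[:5])   # underlying numpy array of the time axis
ts.dyaxis = 0.05           # uniform error, stored on the y Axis as ddata
print(ts.yaxis.ddata)      # error information held by the Axis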

Features

  • Time and frequency series — TSData and FSData with unit tracking, error propagation, and HDF5 serialisation (.ltpda files, versioned format)
  • Physical unit algebra — parses unit strings ("m/s^2", "pm^1.5", …), multiplies, simplifies, converts to SI, and produces LaTeX axis labels
  • Error propagation — Gaussian errors tracked through every arithmetic operation including +, -, *, /, **, abs, sqrt, log10, exp
  • Spectral estimation — Welch WOSA: psd, asd, csd, mscohere, cohere, tfe; log-scale logpsd via the external lpsd library; PSD / ASD / PS / AS output scaling
  • Spectral windows — 30+ types; each exposes NENBW, PSLL, and 3 dB bandwidth properties
  • Pole/zero models — PZModel with frequency-response evaluation; automatic f/Q ↔ complex root conversion; complex-conjugate pole pairs handled correctly
  • FIR digital filters — lowpass, highpass, bandpass, bandstop; frequency response and time-domain filtering of TSData
  • Noise generation — Franklin algorithm; arbitrary spectral shape prescribed by a PZModel; state maintained across calls for arbitrarily long sequences
  • Differentiation — six methods: simple numpy finite difference, 2-point, 3-point, 5-point, order-2 polynomial fit, and order-2 with 5-point smoothing; orders Zero, First, Second
  • Resampling and fractional delay — windowed-sinc interpolation with Blackman window
  • Plotting — plot, loglog, semilogy, semilogx; complex data automatically splits into magnitude and phase panels; error bars with ShowErrors=True, ErrorType='area'
  • File I/O — save() / load() on all data objects; from_txt_file() and from_complex_txt_file() class-method constructors

Not yet implemented

  • IIR filters (MATLAB miir)
  • plist parameter-list objects (currently plain Python keyword arguments)
  • XYZData class with spectrogram support
  • Additional math operators on XYData: sin, cos, tan and friends
  • Log-scale spectral estimators: ltfe, lcohere, and equivalents of the remaining LTPDA lpsd family
  • fpsder — fractional polynomial derivative (started, not finished)
  • Vectorised spectral functions — psd(*ts_list) / asd(*ts_list) to operate on multiple objects at once
  • Axis-level method helper — a generic wrapper to apply arbitrary functions to an Axis with correct error propagation
  • Time-domain simulation / step response for PZModel
  • Calibration objects and control-system design utilities
  • Docstrings — help text coverage is incomplete throughout the package

Directory layout

python/
├── ltpda/
│   ├── ydata.py          YData base class
│   ├── xydata.py         XYData (general 2-D data)
│   ├── tsdata.py         TSData (time-series, with absolute t0 support)
│   ├── fsdata.py         FSData (frequency-series)
│   ├── pzmodel.py        PZModel + PZ (pole/zero transfer functions)
│   ├── functions.py      Module-level function wrappers
│   ├── repo/             Repository connectivity (direct MySQL / PyMySQL)
│   │   ├── __init__.py   Exports LTPDARepository
│   │   ├── client.py     LTPDARepository — main public API class
│   │   ├── models.py     SubmitResult, ObjectMeta, SearchResult dataclasses
│   │   ├── _connection.py  MySQL connection wrapper (RepoConnection)
│   │   ├── _submit.py    Submit logic (mirrors MATLAB submit.m)
│   │   ├── _retrieve.py  Retrieve / time-range / HDF5 deserialization
│   │   └── _search.py    Search, find, metadata, report utilities
│   ├── utils/
│   │   ├── axis.py       Axis — numpy array with units and errors
│   │   ├── unit.py       Unit — symbolic algebra and SI conversion
│   │   ├── specwin.py    Spectral windows (30+ types)
│   │   └── math/         Helper math utilities (rat, intfact, normal_round)
│   ├── dsp/
│   │   ├── filter.py     TF, DFilter, FIR digital filter classes
│   │   ├── spectral.py   PSD, ASD, CSD, coherence, TFE estimators
│   │   └── noisegen.py   Franklin noise generator
│   ├── mixins/           Composable mixins (operators, plotting, diff, DSP)
│   └── Examples/         Jupyter notebooks (submit, retrieve, time-range examples)
├── docker/               Dockerfile for CI / containerised testing
└── tests/                pytest test suite (~54% coverage)

Development

Run the tests

make test
# or
poetry run pytest

All tests must pass and coverage must not drop below 54 %.

Docker

A docker/Dockerfile builds a self-contained Python environment with ltpda installed (Python 3.10 by default, also tested against 3.7). The Makefile provides helpers:

make docker         # build gwdiexp/ltpda:develop (and :develop-3.10)
make docker-push    # push both tags to Docker Hub
make test-docker    # run the test suite inside the container

The Docker image is primarily used for CI. To run tests in the container locally:

docker run -v $(pwd):/code --rm -it gwdiexp/ltpda:develop make test

Code style

Black (88-character lines), isort, pylint, and mypy are enforced via pre-commit. The hooks run automatically before each commit once enabled:

poetry run pre-commit install

Release a new version

poetry version patch   # bug fixes
poetry version minor   # new features
poetry version major   # breaking changes

Then merge to main.

Open design questions

These architectural decisions are unresolved and worth settling before the relevant areas grow further:

  • Plotter separation — plotting methods (plot, loglog, …) currently live as mixins on the data classes. An alternative is a standalone TSPlotter / FSPlotter class: tsplt.loglog(ts1, ts2, ts3). This would decouple visualisation from data and make the classes easier to test. A sketch of one possible interface follows this list.

  • Spectral and filter mixins — psd, asd, tfe, and filter application currently live in separate modules. Since they only operate on TSData, mixing them directly onto TSData (like TSDataDSP) would give ts.psd(navs=10) call syntax. Trade-off: convenience vs separation of concerns.

  • Setter validation in Axis — input checking for data, ddata, and units is spread across the data classes. Moving it into Axis.__set__ would centralise validation and make subclassing safer.
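
A minimal sketch of what the decoupled plotter could look like (TSPlotter and its method are hypothetical, invented here purely for illustration):

import matplotlib.pyplot as plt

class TSPlotter:
    """Plots TSData objects without the data classes depending on matplotlib."""

    def loglog(self, *series, **kwargs):
        fig, ax = plt.subplots()
        for ts in series:
            ax.loglog(ts.xaxis.data, abs(ts.yaxis.data), label=ts.name, **kwargs)
        ax.legend()
        return ax

# tsplt = TSPlotter()
# tsplt.loglog(ts1, ts2, ts3)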


Known issues

The following open issues are tracked upstream at gitlab.com/pyda-group/pyda/-/issues.

Bugs:

  • #6 — ydata / ydata raises WrongSizeException. Division between two XYData / YData objects fails due to a unit exponent list length mismatch. Workaround: divide the underlying numpy arrays directly.

  • #5 — split_by_time uses indices instead of time values. Start/stop times are multiplied by fs and used as sample indices rather than compared against the actual time axis. Results are incorrect for data that does not start at t = 0.

  • #23 — numpy.array * YData calls YData.__mul__ element-wise. When a numpy array is the left operand, Python dispatches multiplication to YData.__mul__ repeatedly rather than treating the array as a single operand. Operator test coverage is incomplete.
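
    For context, the usual NumPy-side remedy is to opt the wrapper class out of NumPy's ufunc dispatch, so that array * obj falls back to the class's own reflected operator once, with the whole array as the operand. A generic, self-contained illustration (not ltpda code):

    import numpy as np

    class Wrapped:
        """Toy stand-in for YData, only to illustrate the dispatch mechanism."""

        __array_ufunc__ = None        # tell NumPy not to handle the operation itself

        def __init__(self, data):
            self.data = np.asarray(data, dtype=float)

        def __mul__(self, other):
            return Wrapped(self.data * other)

        __rmul__ = __mul__            # numpy_array * Wrapped now lands here, once

    print((np.arange(3) * Wrapped([1.0, 2.0, 3.0])).data)   # [0. 2. 6.]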

Design limitations:

  • #11 — No vectorised operations on lists of objects. There is no array-of-objects type. Calling .plot() on a Python list of TSData objects requires my_list[0].plot(*my_list[1:]) as a workaround.

  • Processing history (partial implementation) — Every ltpda object carries a .history attribute that records Python-side operations (constructor, arithmetic, DSP, repo retrieve/submit). When retrieving a MATLAB-submitted object, the full LTPDA history chain is parsed from <historyRoot> XML and accessible as obj.history. Known limitations:

    • hist2py() (code reconstruction from history) is not implemented.
    • When MATLAB retrieves a ltpda-processed object, Python steps appear in MATLAB's history browser but hist2m() cannot reconstruct them — it produces comments for Python nodes.
    • History is not stored in the ltpda HDF5 format; it lives only in the XML serialisation (objs.xml). Objects retrieved via the HDF5 path (ltpda-submitted) carry only their Python-side history, not any MATLAB chain.
    from ltpda.history import display as show_history
    
    # Python-tracked history
    ts = TSData.randn(nsecs=100, fs=10)
    ts2 = ts * 2.0
    show_history(ts2.history)
    # [py] mul  2024-01-15 00:01:02
    #   [py] TSData.constructor  2024-01-15 00:01:00
    
    # MATLAB history (after repo.retrieve on a MATLAB-submitted ao)
    lpsd_obj = repo.retrieve(42)
    show_history(lpsd_obj.history)
    # [py] repo.retrieve  2024-01-15 00:02:00
    #   [ml] lpsd  2024-01-15 00:00:58
    #     [ml] plus  2024-01-14 23:59:50
    

Enhancements under discussion:

  • #9 — Replace ddata with the uncertainties library. Proposal to use uncertainties.uarray instead of separate data/error arrays for more transparent error propagation. A brief illustration follows this list.

  • #8 — Object __str__ should show data values. print(ts) currently shows shape only. Request to show first/last values following the numpy convention.

  • #7 — Mixed-unit plots should warn. Plotting objects with incompatible units silently produces a misleading axis label. Request to display [Mixed] or raise a warning.

  • #3 — Package name is taken on PyPI (resolved — package renamed to ltpda). The name was already registered on PyPI by an unrelated project. Resolved by renaming this package to ltpda.
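
For reference, a brief illustration of what #9 proposes, using the uncertainties package directly (not ltpda code):

from uncertainties import unumpy as unp
import numpy as np

values = np.array([1.0, 2.0, 3.0])
errors = np.array([0.1, 0.1, 0.2])

a = unp.uarray(values, errors)      # value ± error pairs in a single array
b = a * 2 + 1                       # errors propagate through arithmetic automatically
print(unp.nominal_values(b), unp.std_devs(b))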


Version history

0.2.2

  • First pypi.org release

0.2.1

  • Renamed package from pyda to ltpda to resolve PyPI naming conflict (issue #3). File extension changed from .pyda to .ltpda (.pyda files still load for backward compatibility). Repository sentinel changed from binary_pyda to binary_hdf5.
  • Dependency updates for NumPy 2.x compatibility: numpy uncapped (≥ 1.18), matplotlib ≥ 3.9, h5py ≥ 3.10. Added mpmath ≥ 1.0 as a runtime dependency.
  • Wired up ltpda.dsp.NoiseGen (Franklin noise generator): added missing mpmath dependency, exported from ltpda.dsp, added smoke tests.
  • Bug fixes:
    • PZ() no-argument constructor crashed with TypeError because numpy.isreal(None) is True, causing fq2ri(f0=None) to be called. Guarded dispatch block with if f is not None.
    • TSData.nsecs() and TSData.fs() raised ValueError / emitted numpy warnings on empty time-series objects. Both now return 0.0 early when xdata() is empty.
    • Axis.ddata setter size check was gated on numpy.shape(ddata)[0] > 2 (first dimension, not total size), allowing mismatched error vectors to be silently accepted. Replaced with ddata.size > 1.
  • Test suite: removed three stale @unittest.skip decorators (bugs resolved). Excluded ltpda/repo/* from coverage measurement (requires live MySQL). Coverage threshold met at 56%.

0.2.0

  • Repository connectivity: MySQL backend, submit/retrieve AO objects, search interface.
  • History tracking: record and replay analysis steps; XML exchange with MATLAB LTPDA.

Upstream baseline (pyda, pre-fork)

The following was already present in pyda-group/pyda before this fork was created, written by Martin Hewitson, Artem Basalaev, Christian Darsow-Fromm, and Oliver Gerberding:

  • YData, XYData, TSData, FSData — core data classes with error propagation
  • Unit — physical unit algebra (parse, simplify, convert to SI)
  • PZModel / PZ — pole/zero model representation and response computation
  • SpecWin — spectral window functions (Hann, flat-top, Kaiser-Bessel, …)
  • dsp.spectral — PSD / ASD estimation via lpsd
  • dsp.filter — digital filter representation
  • dsp.noisegen — Franklin colored-noise generator (wired up in 0.2.1)
  • HDF5 save/load for all data classes
  • Operator overloading (+, -, *, /, **, comparison) with unit checking

Heritage

ltpda was created by Martin Hewitson, Artem Basalaev, Christian Darsow-Fromm, and Oliver Gerberding as a Python reimplementation of the LTPDA MATLAB toolbox for gravitational-wave and precision-measurement data analysis. The upstream project is maintained at gitlab.com/pyda-group/pyda.

This fork extends the upstream work for integration with the LTPDA repository stack.

Original authors:

  • Martin Hewitson
  • Artem Basalaev
  • Christian Darsow-Fromm
  • Oliver Gerberding

Disclaimer

This software is provided "as is", without warranty of any kind, express or implied. Use at your own risk. The authors make no guarantees about correctness, fitness for a particular purpose, or continued development. See LICENSE.md for full terms.


License

Upstream pyda copyright 2022 Martin Hewitson, Artem Basalaev, Christian Darsow-Fromm, and Oliver Gerberding. See Heritage.

Modifications and extensions in this fork: Copyright 2026 Simon Barke.

Licensed under the Apache License, Version 2.0. See LICENSE.md.


Download files

Download the file for your platform.

Source Distribution

ltpda-0.2.2.tar.gz (6.0 MB)

Uploaded Source

Built Distribution


ltpda-0.2.2-py3-none-any.whl (6.0 MB)

Uploaded Python 3

File details

Details for the file ltpda-0.2.2.tar.gz.

File metadata

  • Download URL: ltpda-0.2.2.tar.gz
  • Upload date:
  • Size: 6.0 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.3.4 CPython/3.10.11 Windows/10

File hashes

Hashes for ltpda-0.2.2.tar.gz
Algorithm Hash digest
SHA256 5fb545921338865ecafa77a3472f46126504bd8d54303ab62933db0aded3dc02
MD5 f38b305716e1bc789476667b85088628
BLAKE2b-256 cbf8761be7191bb2b0e6b493f50ea5976d01730ba8767eec9db264816c6a15a0


File details

Details for the file ltpda-0.2.2-py3-none-any.whl.

File metadata

  • Download URL: ltpda-0.2.2-py3-none-any.whl
  • Upload date:
  • Size: 6.0 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.3.4 CPython/3.10.11 Windows/10

File hashes

Hashes for ltpda-0.2.2-py3-none-any.whl
Algorithm Hash digest
SHA256 5c05a73f6642da03c61fbc7c83933bf3e405c30475f7bbd38745d76c64815a84
MD5 7a3d1e63198d6813dac56801c508ba77
BLAKE2b-256 e7e359b62b850bd092494bada06686cb5087cfc8f23d80bde0d3b3943be72e8b

