Python fork of pyda (Hewitson et al.) — LTPDA-style signal processing with repository integration.
ltpda
Python package for LTPDA-style signal processing and LTI system analysis. Fork of pyda-group/pyda, extended for integration with the LTPDA repository stack.
Overview
ltpda provides Python equivalents of the core LTPDA MATLAB toolbox objects: time-series and frequency-series data classes, spectral estimation, pole/zero models, digital filters, and a physical unit algebra. The "it just works" principle of the original MATLAB toolbox is preserved — common analysis tasks require very few lines of code, while the underlying data structures remain fully accessible for advanced use.
The package is in active development. Core signal processing is stable. Features not yet
implemented include IIR filter design, plist parameter-list objects, XYZData, and full
MATLAB-parity history code reconstruction (hist2py).
Requirements
- Python 3.10 or later (tested up to 3.13)
- numpy ≥ 1.18, scipy ≥ 1.5, matplotlib ≥ 3.0, h5py ≥ 3
- lpsd ≥ 1.0.2 (log-scale PSD estimator — see Installation)
Installation
pip from PyPI
pip install ltpda
Full package listing: https://pypi.org/project/ltpda/
pip from wheel
Download the .whl file from the Releases page, then:
pip install ltpda-<version>-py3-none-any.whl
pip from source
No-clone option (installs directly from the git repository):
pip install git+https://github.com/LordSkippy/LTPDA.git#subdirectory=python
Or clone first:
git clone <this-repo>
cd LTPDA/python
pip install .
All dependencies, including lpsd, are installed automatically.
Developers — Poetry
cd LTPDA/python
poetry install
poetry run pre-commit install # enable Black, isort, mypy, pylint hooks
lpsd (Apple Silicon only)
lpsd (source: git.physnet.uni-hamburg.de) installs automatically as a listed dependency.
The only reason to touch it manually is a performance issue on Apple Silicon (M1/M2): lpsd contains C code that uses
long double arithmetic, which on ARM is the same width as double (64-bit). The
polyreg step has been observed to dominate runtime. If logpsd is unusably slow,
compile lpsd from source with architecture-specific flags:
# from the lpsd source directory
gcc -arch arm64 -c -fPIC ltpda_dft.c
gcc -arch arm64 -shared -o ltpda_dft.so ltpda_dft.o
If contributing performance fixes for Apple Silicon, check the lpsd C sources for long double usage throughout.
Repository connectivity
ltpda can connect directly to an LTPDA repository database using the same credential and schema conventions as the MATLAB toolbox — plain MySQL via PyMySQL, no REST API involved. If the MySQL server is on a remote host, establish an SSH tunnel externally and point ltpda at the local forwarded port (see SSH tunnelling below).
Quick connect
from ltpda.repo import LTPDARepository
# Explicit credentials
repo = LTPDARepository('db.host.com', 'my_repo', 'alice', 'mysql_secret')
# Non-standard port (or local SSH tunnel endpoint)
repo = LTPDARepository('localhost', 'my_repo', 'alice', 'mysql_secret', port=3307)
# From environment variables
repo = LTPDARepository.from_env()
# Context manager — connection is always closed on exit
with LTPDARepository('db.host.com', 'my_repo', 'alice', 'mysql_secret') as repo:
ts = repo.retrieve(42)
Credentials
Two separate credentials are required:
| Credential | What it is | Where to find it |
|---|---|---|
| username | MySQL username | Same as the LTPDA web UI login name |
| password | MySQL password | The mysql_password field in the web UI user settings — not the web login password |
These are the same credentials the MATLAB toolbox uses to connect via JDBC.
Environment variables
LTPDARepository.from_env() reads connection parameters from five environment variables:
| Variable | Meaning | Required |
|---|---|---|
| LTPDA_HOST | MySQL hostname — use localhost when an SSH tunnel is running | yes |
| LTPDA_PORT | MySQL port — use the local tunnel port when tunnelling | no (default 3306) |
| LTPDA_DB | Database (schema) name | yes |
| LTPDA_USER | MySQL username (web UI login name) | yes |
| LTPDA_PASS | MySQL password (mysql_password from the web UI) | yes |
If any required variable is missing, from_env() raises EnvironmentError listing every
absent variable.
Setting env vars — Option A: shell rc file (Linux / macOS / Git Bash)
Add to ~/.bashrc, ~/.zshrc, or ~/.bash_profile and then reload:
export LTPDA_HOST=db.host.com
export LTPDA_DB=my_repo
export LTPDA_USER=alice
export LTPDA_PASS=mysql_secret # the mysql_password from the web UI
# optional:
export LTPDA_PORT=3306
source ~/.bashrc # or open a new terminal
Setting env vars — Option B: .env file + python-dotenv (recommended for notebooks)
Create a .env file next to your notebook or script (add it to .gitignore; never commit credentials):
# .env — do NOT commit this file
LTPDA_HOST=db.host.com
LTPDA_DB=my_repo
LTPDA_USER=alice
LTPDA_PASS=mysql_secret
Then load it before calling from_env():
from dotenv import load_dotenv # pip install python-dotenv
load_dotenv()
from ltpda.repo import LTPDARepository
repo = LTPDARepository.from_env()
Setting env vars — Option C: notebook cell (quick interactive use)
import os
os.environ['LTPDA_HOST'] = 'db.host.com'
os.environ['LTPDA_DB'] = 'my_repo'
os.environ['LTPDA_USER'] = 'alice'
os.environ['LTPDA_PASS'] = 'mysql_secret'
from ltpda.repo import LTPDARepository
repo = LTPDARepository.from_env()
Credentials in saved notebooks can be exposed unintentionally — passing them directly to the constructor is often safer:
repo = LTPDARepository('db.host.com', 'my_repo', 'alice', 'mysql_secret')
Setting env vars — Option D: Windows PowerShell (permanent user variable)
[System.Environment]::SetEnvironmentVariable('LTPDA_HOST', 'db.host.com', 'User')
[System.Environment]::SetEnvironmentVariable('LTPDA_DB', 'my_repo', 'User')
[System.Environment]::SetEnvironmentVariable('LTPDA_USER', 'alice', 'User')
[System.Environment]::SetEnvironmentVariable('LTPDA_PASS', 'mysql_secret','User')
Restart any terminals or IDE sessions after setting them.
Setting env vars — Option E: CI/CD (GitHub Actions / GitLab CI)
Store secrets in the platform secret store, not in the repository:
# GitHub Actions example
env:
LTPDA_HOST: ${{ secrets.LTPDA_HOST }}
LTPDA_DB: ${{ secrets.LTPDA_DB }}
LTPDA_USER: ${{ secrets.LTPDA_USER }}
LTPDA_PASS: ${{ secrets.LTPDA_PASS }}
SSH tunnelling
ltpda does not manage SSH tunnels. If MySQL is behind a firewall, create the tunnel externally before connecting:
# Forward local port 3307 → MySQL on db.internal:3306 via gateway.host
ssh -L 3307:db.internal:3306 gateway.host -N &
Then pass hostname='localhost', port=3307 to ltpda:
repo = LTPDARepository('localhost', 'my_repo', 'alice', 'mysql_secret', port=3307)
Submit
from datetime import datetime
from ltpda.tsdata import TSData
from ltpda.repo import LTPDARepository
ts = TSData.randn(nsecs=3600, fs=10, name='ACC_X', yunits='m/s^2')
ts.t0 = datetime(2024, 1, 15, 0, 0) # absolute UTC start time
with LTPDARepository('db.host.com', 'my_repo', 'alice', 'secret') as repo:
result = repo.submit(
ts,
experiment_title='Noise floor run',
experiment_desc='Accelerometer noise at rest on optical bench',
analysis_desc='No processing applied — raw data submission',
quantity='acceleration',
keywords='noise, accelerometer',
)
print(result.id, result.uuid) # assigned DB id and UUID
Submit multiple objects in one transaction — a collection is created automatically:
results = repo.submit(
ts1, ts2, ts3,
experiment_title='Three-axis measurement',
experiment_desc='Simultaneous X Y Z accelerometer data',
analysis_desc='Raw data, no filtering',
)
print(results[0].cid) # collection ID shared by all three
Mandatory field minimum lengths (mirrors the MATLAB web UI validation):
| Field | Minimum |
|---|---|
| experiment_title | 5 characters |
| experiment_desc | 10 characters |
| analysis_desc | 10 characters |
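A client-side pre-check mirroring these limits could look like the following sketch (check_submit_fields is a hypothetical helper, not part of the package):

```python
# Hypothetical pre-submit check mirroring the minimum-length rules above
_MIN_LENGTHS = {
    "experiment_title": 5,
    "experiment_desc": 10,
    "analysis_desc": 10,
}

def check_submit_fields(**fields):
    """Raise ValueError naming any mandatory field that is missing or too short."""
    for name, minimum in _MIN_LENGTHS.items():
        value = fields.get(name) or ""
        if len(value) < minimum:
            raise ValueError(
                f"{name!r} must be at least {minimum} characters, got {len(value)}"
            )
```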
Retrieve by ID
# Single object
ts = repo.retrieve(42)
# Multiple objects
ts1, ts2 = repo.retrieve(42, 43)
# All objects in a collection
objects = repo.retrieve(cid=7)
Retrieve path selection (cross-compatibility)
ltpda auto-detects the submission origin from the objmeta.version field and
chooses the optimal deserialisation path automatically:
| Object origin | Path chosen |
|---|---|
| Python (ltpda) | HDF5 binary in bobjs.mat → fallback to XML if missing |
| MATLAB | XML in objs.xml directly (MATLAB .mat binary is not parseable by Python) |
No user action is required — repo.retrieve(id) always returns the correct
ltpda object regardless of which tool submitted it.
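The decision can be sketched as a small dispatch function. The sentinel value and field roles come from this README; the function itself is illustrative, not the package's internals:

```python
def choose_retrieve_path(objs_xml: str, has_hdf5_blob: bool) -> str:
    """Pick a deserialisation path from the stored objs.xml content.

    ltpda-submitted objects store the sentinel 'binary_hdf5' in objs.xml
    and the real payload as HDF5; MATLAB-submitted objects store real XML.
    """
    if objs_xml == "binary_hdf5":
        # Python origin: prefer the HDF5 blob, fall back to XML if it is missing
        return "hdf5" if has_hdf5_blob else "xml"
    # MATLAB origin: parse the XML serialisation directly
    return "xml"
```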
Time-range retrieval
Retrieve all time-series segments overlapping a window, then concatenate and crop:
from datetime import datetime
ts = repo.get(
'ACC_X', # SQL LIKE pattern on object name
t0=datetime(2024, 1, 15, 0, 0),
t1=datetime(2024, 1, 15, 6, 0),
author='alice', # optional author filter
)
# ts is a single TSData spanning exactly [t0, t1]
This is the primary method for retrieving long continuous time-series stored as multiple shorter segments — the same pattern the MATLAB toolbox uses internally.
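The pattern behind get() (concatenate overlapping segments in time order, then crop to the requested window) can be illustrated with plain numpy arrays. This sketch is not the package's implementation:

```python
import numpy as np

def concat_and_crop(segments, t0, t1):
    """Join (times, values) segment pairs in time order, then crop to [t0, t1]."""
    segments = sorted(segments, key=lambda seg: seg[0][0])  # order by start time
    t = np.concatenate([seg[0] for seg in segments])
    y = np.concatenate([seg[1] for seg in segments])
    mask = (t >= t0) & (t <= t1)  # keep only samples inside the window
    return t[mask], y[mask]
```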
Search
# All objects
results = repo.find()
# By name pattern
results = repo.find(name='ACC%')
# By author and time range
results = repo.find(
name='ACC_X',
author='alice',
date_from='2024-01-01',
date_to='2024-02-01',
)
# Objects whose stored timespan overlaps [t0, t1]
from datetime import datetime
results = repo.find(
name='ACC_X',
timespan=(datetime(2024, 1, 15), datetime(2024, 1, 16)),
)
for r in results:
print(r.id, r.name, r.submitted, r.t_start, r.t_stop)
Utility methods
# Full metadata (no binary download)
metas = repo.get_metadata(42, 43, 44)
print(metas[0].fs, metas[0].nsecs, metas[0].t0)
# Most recent segment for a channel
latest = repo.get_latest('ACC_X')
# ID ↔ UUID
uuid = repo.get_uuid(42)
obj_id = repo.get_id(uuid)
# Collection membership
ids = repo.get_collection_ids(7)
# Duplicate detection
dupes = repo.find_duplicates() # list of (id, uuid) pairs
# CSV report
repo.report('repo_dump.csv', date_from='2024-01-01')
# All accessible schemas
databases = repo.list_databases()
# Without opening a persistent connection
databases = LTPDARepository.available_databases('db.host.com', 'alice', 'secret')
Format compatibility
ltpda-submitted objects store their binary payload as HDF5 (.ltpda format) in the bobjs
table. The objs.xml field is set to the sentinel value binary_hdf5.
- MATLAB: can see ltpda-submitted objects in the web UI (all metadata fields are populated), but cannot reconstruct the data itself (binary_hdf5 is not a valid LTPDA XML serialization).
- ltpda: can fully retrieve both ltpda-submitted HDF5 objects and MATLAB-submitted binary .mat objects (best-effort via scipy.io.loadmat; complex LTPDA class hierarchies may not parse correctly). XML-only MATLAB objects (binary=False) raise NotImplementedError.
Quick start
from ltpda.tsdata import TSData
from ltpda.dsp.spectral import psd, asd
# 10000 s of white noise at 10 Hz
ts = TSData.randn(nsecs=10000, fs=10, name='noise', yunits='m')
# Power and amplitude spectral density
Pxx = psd(ts, navs=10, window='BH92')
Sxx = asd(ts, navs=10, window='BH92')
Sxx.loglog()
Documentation
Creating data objects
TSData — time-series with a sampling rate. The time axis is auto-generated from fs.
from ltpda.tsdata import TSData
# White noise
ts = TSData.randn(nsecs=1000, fs=100, name='noise', yunits='V')
# Sine wave
s = TSData.sinewave(fs=100, nsecs=10, A0=2.0, f0=1.2, phi=0, name='sine', yunits='V')
# Zeros
z = TSData.zeros(nsecs=100, fs=10, yunits='m')
XYData / FSData — general 2-D data and frequency-series.
from ltpda.xydata import XYData
from ltpda.fsdata import FSData
import numpy as np
xy = XYData(xaxis=np.linspace(0, 1, 100), yaxis=np.random.randn(100),
xname='Time', yname='Signal', xunits='s', yunits='V')
# FSData from a parametric function (e.g. noise model)
fs_noise = FSData.from_function(
lambda f: 13.5e-12**2 * (1 + (2e-3 / f)**4),
xmin=1e-4, xmax=1, npts=1000, xunits='Hz', yunits='m^2/Hz',
)
YData — scalar data with units (no x axis).
from ltpda.ydata import YData
y = YData(yaxis=3.14, yunits='m')
Arithmetic and error propagation
All arithmetic operations are supported between data objects and between objects and
scalars / numpy arrays. Gaussian errors in ddata are propagated automatically.
ts = TSData.randn(nsecs=100, fs=10, yunits='V')
ts2 = TSData.randn(nsecs=100, fs=10, yunits='V')
result = ts + ts2 # addition
ratio = ts / ts2 # division — units cancel, errors propagate
power = ts ** 3 # power — errors propagated via chain rule
scaled = ts * 10 # scalar multiply
# Attach per-sample errors and plot with error bars
ts.dyaxis = 0.1 # uniform error (shorthand for ts.yaxis.ddata = 0.1)
ts.plot(ShowErrors=True, ErrorType='area')
Units are tracked through every operation:
from ltpda.utils.unit import Unit
u = Unit('m^2 Hz^-1')
print(u.char()) # [m^(2)][Hz^(-1)]
print(u.sqrt().char()) # [m][Hz^(-1/2)]
Spectral analysis
All estimators accept a TSData input and return FSData.
from ltpda.dsp.spectral import psd, asd, csd, mscohere, cohere, tfe, logpsd
# Welch PSD / ASD
Pxx = psd(ts, navs=10, window='BH92') # Power spectral density
Sxx = asd(ts, navs=10, window='BH92') # Amplitude spectral density (= √PSD)
# Scale options: 'PSD' (default), 'ASD', 'PS', 'AS'
Sxx2 = psd(ts, navs=10, window='BH92', scale='ASD')
# Cross-spectral quantities (two inputs)
Pxy = csd(ts1, ts2, navs=10, window='BH92') # Cross-spectral density
coh = mscohere(ts1, ts2, navs=10) # Magnitude-squared coherence
ccoh = cohere(ts1, ts2, navs=10) # Complex coherence
H = tfe(ts1, ts2, navs=10, window='BH92') # Transfer function estimate
# Log-scale PSD (requires lpsd package)
lPxx = logpsd(ts)
lPxx2 = logpsd(ts, order=3) # higher-order debiasing
# Plot with errors
lPxx.sqrt().loglog(lPxx2.sqrt(), ShowErrors=True, ErrorType='area')
Nfft vs navs: pass Nfft to set the segment length in samples, or navs to set the target number of averages. Both control the frequency resolution / variance trade-off.
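For Welch-style averaging the two parameters are linked: navs segments of length Nfft with fractional overlap cover roughly Nfft * (1 + (navs - 1) * (1 - overlap)) samples. A sketch of solving for Nfft given a target number of averages (an approximation for illustration; the package may round or clamp differently):

```python
def nfft_from_navs(nsamples: int, navs: int, overlap: float = 0.5) -> int:
    """Approximate segment length giving `navs` averages over `nsamples`
    samples with the given fractional overlap (0.5 = Welch's 50 %)."""
    # navs segments of length nfft, stepping nfft * (1 - overlap), span
    # nfft + (navs - 1) * nfft * (1 - overlap) samples; solve for nfft.
    return int(nsamples / (1 + (navs - 1) * (1 - overlap)))
```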
Spectral windows
from ltpda.utils.specwin import Specwin
# List all available windows
Specwin.supportedWindows()
# ['Rectangular', 'Welch', 'Bartlett', 'Hanning', 'Hamming',
# 'Nuttall3', 'Nuttall4', 'Nuttall3a', 'Nuttall3b', 'Nuttall4a',
# 'Nuttall4b', 'Nuttall4c', 'BH92', 'SFT3F', 'SFT3M', 'FTNI',
# 'SFT4F', 'SFT5F', 'SFT4M', 'FTHP', 'HFT70', 'FTSRS', 'SFT5M',
# 'HFT90D', 'HFT95', 'HFT116D', 'HFT144D', 'HFT169D', 'HFT196D',
# 'HFT223D', 'HFT248D', 'Kaiser']
# Inspect a window
w = Specwin('BH92', N=1024)
print(w.nenbw) # Normalised equivalent noise bandwidth
print(w.psll) # Peak sidelobe level (dB)
# Kaiser window: specify sidelobe level
w_k = Specwin('Kaiser', N=1024, psll=200)
Pass the window name as a string to any spectral estimator: window='BH92'.
Pole/zero models
from ltpda.pzmodel import PZModel, PZ
import numpy as np
# PZ objects: real pole (f only) or complex pair (f + Q)
p1 = PZ(0.01, Q=2) # complex pair at 0.01 Hz, Q=2
p2 = PZ(3) # real pole at 3 Hz
z1 = PZ(0.1) # real zero at 0.1 Hz
z2 = PZ(0.2) # real zero at 0.2 Hz
pzm = PZModel(poles=[p1, p2], zeros=[z1, z2], gain=2, delay=0,
iunits='m', ounits='V')
print(pzm)
# Evaluate frequency response → returns FSData
freqs = np.logspace(-3, 1, 500)
r = pzm.resp(freqs=freqs)
r.abs().loglog()
Noise generation
Generate a time-series with a spectral shape defined by a PZModel using the Franklin
algorithm. The result can be arbitrarily long; state is maintained across calls.
from ltpda.dsp.noisegen import NoiseGen
from ltpda.dsp.spectral import logpsd
ng = NoiseGen(pzm=pzm, fs=30)
ts = ng.generateNoise(nsecs=1e5)
# Verify: compare generated spectrum against model response
S = logpsd(ts)
r = pzm.resp(freqs=S.xaxis.data)
S.sqrt().loglog(r.abs())
FIR digital filters
from ltpda.dsp.filter import FIR
# Design filters (scipy windowed-sinc method)
lp = FIR.lowpass( fc=1, gain=1, fs=10, order=32, win='blackmanharris',
iunits='V', ounits='m')
hp = FIR.highpass(fc=1, gain=1, fs=10, order=32, win='blackmanharris',
iunits='V', ounits='m')
bp = FIR.bandpass(fc=[0.01, 0.1], gain=2, fs=10, order=1024, iunits='V', ounits='m')
bs = FIR.bandstop(fc=[0.01, 0.1], gain=2, fs=10, order=1024, iunits='V', ounits='m')
# Frequency response → FSData
r = lp.resp(f1=0.1, f2=5, nf=1000)
r.loglog()
# Apply to a time-series
ts_filtered = lp.filter(ts)
Differentiation
Five numerical differentiation methods are available via TSData.diff().
s = TSData.sinewave(fs=100, nsecs=10, A0=1, f0=1.2, phi=0, yunits='V')
# method: 'diff', '2point', '3point', '5point', 'order2', 'order2Smooth'
# order: 'Zero' (smoothed input), 'First' (first derivative), 'Second'
ds1 = s.diff(method='3point', order='First')
ds2 = s.diff(method='5point', order='Second')
# order2 and order2Smooth do not take an 'order' argument
ds_o2 = s.diff(method='order2Smooth')
s.plot(ds1, ds2)
| Method | Notes |
|---|---|
| diff | Simple numpy finite difference |
| 2point | Two-point stencil |
| 3point | Three-point centered stencil |
| 5point | Five-point centered stencil, higher accuracy |
| order2 | Polynomial fitting on irregular grids |
| order2Smooth | order2 with 5-point smoothing pass |
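As a point of reference, the centered first-derivative stencils behind the 3point and 5point methods can be written out directly in numpy (an illustration of the standard formulas, not the package code):

```python
import numpy as np

def diff_3point(y, dx):
    """Three-point centered first derivative: (y[i+1] - y[i-1]) / (2*dx)."""
    return (y[2:] - y[:-2]) / (2 * dx)

def diff_5point(y, dx):
    """Five-point centered first derivative:
    (-y[i+2] + 8*y[i+1] - 8*y[i-1] + y[i-2]) / (12*dx)."""
    return (-y[4:] + 8 * y[3:-1] - 8 * y[1:-3] + y[:-4]) / (12 * dx)
```

The 5-point stencil trades two extra edge samples for a higher-order truncation error, which is why the package lists it as higher accuracy.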
Splitting data
# Time-series: split by time window [start, stop] in seconds
segment = ts.split_by_time(times=[10, 60])
# Frequency-series: split by frequency range [f_low, f_high] in Hz
band = Sxx.split_by_frequency(freqs=[0.1, 1.0])
Known bug (#5): split_by_time currently uses time values as sample indices rather than comparing against the actual time axis. Use with care on data that does not start at t = 0.
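Until the fix lands, you can crop on actual time values yourself. A sketch using plain numpy arrays (adapt to your object's own accessors):

```python
import numpy as np

def crop_by_time(t, y, t_start, t_stop):
    """Keep samples whose *time values* fall in [t_start, t_stop],
    independent of where the series starts (unlike the buggy index path)."""
    mask = (t >= t_start) & (t <= t_stop)
    return t[mask], y[mask]
```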
plotinfo — per-object style metadata
set_plotinfo() attaches persistent style and legend metadata to any ltpda object.
iplot() reads it automatically, with explicit iplot() kwargs taking priority.
ts.set_plotinfo(color='steelblue', linewidth=2.5, linestyle='--')
ts.iplot() # steelblue dashed line, width 2.5 — no extra kwargs
# Legend and error-bar control per object
sig.set_plotinfo(include_in_legend=False) # plotted but unlabelled
calib.set_plotinfo(show_errors=True) # error bars without ShowErrors=True kwarg
# Marker style
ts.set_plotinfo(marker='o', markersize=6, markerfacecolor='white', markeredgecolor='steelblue')
Full parameter list: color, linestyle, linewidth, marker, markersize,
markerfacecolor, markeredgecolor, fillmarkers, include_in_legend, show_errors.
Priority chain for each style field — iplot() kwarg → plotinfo field → object
loose attribute (obj.color, obj.linestyle, …) → matplotlib default colour cycle.
Repository interoperability — plotinfo survives XML round-trips between Python and MATLAB:
- Python retrieve of a MATLAB AO: the <Style> XML is parsed; obj._plotinfo.color is set to a matplotlib hex string ('#ff0000'), and all other fields are populated from the XML attributes.
- Python submit: a <Style> element is generated with the exact Java color encoding MATLAB expects (Color.getRGB() signed 32-bit decimal). MATLAB retrieves a fully styled AO with the correct color, linestyle, marker, etc.
File I/O
Objects are serialised to HDF5 with a versioned format. The file extension is .ltpda.
# Save
ts.save('my_timeseries.ltpda')
# Load
from ltpda.tsdata import TSData
ts2 = TSData.load('my_timeseries.ltpda')
# Load from text file
ts3 = TSData.from_txt_file('data.txt', fs=100, yunits='V',
xcol=0, ycol=1, delimiter=',')
Core classes
Data hierarchy
YData Y-axis data with units and Gaussian error propagation
└── XYData adds an X axis (general 2-D data)
├── TSData time-series — sampling-rate aware; auto-generates time axis
└── FSData frequency-series — X units default to Hz
Supporting classes
| Class | Purpose |
|---|---|
| Axis | Wraps a numpy array with a Unit, an error array (ddata), and a name |
| Unit | Symbolic unit algebra — parse, multiply, simplify, convert to SI |
| Specwin | 30+ spectral window functions |
| PZ | Single pole or zero in f/Q or complex (s-plane) representation |
| PZModel | Poles, zeros, gain, and delay — evaluates to FSData via .resp() |
| DFilter / FIR | Digital filter classes with .resp() and .filter() |
| NoiseGen | Franklin-algorithm colored-noise generator driven by a PZModel |
Features
- Time and frequency series — TSData and FSData with unit tracking, error propagation, and HDF5 serialisation (.ltpda files, versioned format)
- Physical unit algebra — parses unit strings ("m/s^2", "pm^1.5", …), multiplies, simplifies, converts to SI, and produces LaTeX axis labels
- Error propagation — Gaussian errors tracked through every arithmetic operation including +, -, *, /, **, abs, sqrt, log10, exp
- Spectral estimation — Welch WOSA: psd, asd, csd, mscohere, cohere, tfe; log-scale logpsd via the external lpsd library; PSD / ASD / PS / AS output scaling
- Spectral windows — 30+ types; each exposes NENBW, PSLL, and 3 dB bandwidth properties
- Pole/zero models — PZModel with frequency-response evaluation; automatic f/Q ↔ complex root conversion; complex-conjugate pole pairs handled correctly
- FIR digital filters — lowpass, highpass, bandpass, bandstop; frequency response and time-domain filtering of TSData
- Noise generation — Franklin algorithm; arbitrary spectral shape prescribed by a PZModel; state maintained across calls for arbitrarily long sequences
- Differentiation — five methods: 2-point, 3-point, 5-point, order-2 polynomial fit, and order-2 with 5-point smoothing; orders Zero, First, Second
- Resampling and fractional delay — windowed-sinc interpolation with Blackman window
- Plotting — plot, loglog, semilogy, semilogx; complex data automatically splits into magnitude and phase panels; error bars with ShowErrors=True, ErrorType='area'
- File I/O — save() / load() on all data objects; from_txt_file() and from_complex_txt_file() class-method constructors
Not yet implemented
- IIR filters (MATLAB miir)
- plist parameter-list objects (currently plain Python keyword arguments)
- XYZData class with spectrogram support
- Additional math operators on XYData: sin, cos, tan and friends
- Log-scale spectral estimators: ltfe, lcohere, and equivalents of the remaining LTPDA lpsd family
- fpsder — fractional polynomial derivative (started, not finished)
- Vectorised spectral functions — psd(*ts_list) / asd(*ts_list) to operate on multiple objects at once
- Axis-level method helper — a generic wrapper to apply arbitrary functions to an Axis with correct error propagation
- Time-domain simulation / step response for PZModel
- Calibration objects and control-system design utilities
- Docstrings — help text coverage is incomplete throughout the package
Directory layout
python/
├── ltpda/
│ ├── ydata.py YData base class
│ ├── xydata.py XYData (general 2-D data)
│ ├── tsdata.py TSData (time-series, with absolute t0 support)
│ ├── fsdata.py FSData (frequency-series)
│ ├── pzmodel.py PZModel + PZ (pole/zero transfer functions)
│ ├── functions.py Module-level function wrappers
│ ├── repo/ Repository connectivity (direct MySQL / PyMySQL)
│ │ ├── __init__.py Exports LTPDARepository
│ │ ├── client.py LTPDARepository — main public API class
│ │ ├── models.py SubmitResult, ObjectMeta, SearchResult dataclasses
│ │ ├── _connection.py MySQL connection wrapper (RepoConnection)
│ │ ├── _submit.py Submit logic (mirrors MATLAB submit.m)
│ │ ├── _retrieve.py Retrieve / time-range / HDF5 deserialization
│ │ └── _search.py Search, find, metadata, report utilities
│ ├── utils/
│ │ ├── axis.py Axis — numpy array with units and errors
│ │ ├── unit.py Unit — symbolic algebra and SI conversion
│ │ ├── specwin.py Spectral windows (30+ types)
│ │ └── math/ Helper math utilities (rat, intfact, normal_round)
│ ├── dsp/
│ │ ├── filter.py TF, DFilter, FIR digital filter classes
│ │ ├── spectral.py PSD, ASD, CSD, coherence, TFE estimators
│ │ └── noisegen.py Franklin noise generator
│ ├── mixins/ Composable mixins (operators, plotting, diff, DSP)
│ └── Examples/ Jupyter notebooks (submit, retrieve, time-range examples)
├── docker/ Dockerfile for CI / containerised testing
└── tests/ pytest test suite (~54% coverage)
Development
Run the tests
make test
# or
poetry run pytest
All tests must pass and coverage must not drop below 54 %.
Docker
A docker/Dockerfile builds a self-contained Python environment with ltpda installed (Python 3.10 by default, also tested against 3.7). The Makefile provides helpers:
make docker # build gwdiexp/ltpda:develop (and :develop-3.10)
make docker-push # push both tags to Docker Hub
make test-docker # run the test suite inside the container
The Docker image is primarily used for CI. To run tests in the container locally:
docker run -v $(pwd):/code --rm -it gwdiexp/ltpda:develop make test
Code style
Black (88-character lines), isort, pylint, and mypy are enforced via pre-commit. The hooks run automatically before each commit once enabled:
poetry run pre-commit install
Release a new version
poetry version patch # bug fixes
poetry version minor # new features
poetry version major # breaking changes
Then merge to main.
Open design questions
These architectural decisions are unresolved and worth settling before the relevant areas grow further:
- Plotter separation — plotting methods (plot, loglog, …) currently live as mixins on the data classes. An alternative is a standalone TSPlotter / FSPlotter class: tsplt.loglog(ts1, ts2, ts3). This would decouple visualisation from data and make the classes easier to test.
- Spectral and filter mixins — psd, asd, tfe, and filter application currently live in separate modules. Since they only operate on TSData, mixing them directly onto TSData (like TSDataDSP) would give ts.psd(navs=10) call syntax. Trade-off: convenience vs separation of concerns.
- Setter validation in Axis — input checking for data, ddata, and units is spread across the data classes. Moving it into Axis.__set__ would centralise validation and make subclassing safer.
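The last point describes Python's descriptor protocol. A generic sketch of the pattern (class and field names are illustrative, not proposed ltpda code):

```python
import numpy as np

class ValidatedArray:
    """Data descriptor that centralises array validation at assignment time."""

    def __set_name__(self, owner, name):
        self._name = "_" + name

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return getattr(obj, self._name, None)

    def __set__(self, obj, value):
        arr = np.asarray(value, dtype=float)  # rejects non-numeric input early
        if arr.ndim > 1:
            raise ValueError(f"{self._name[1:]} must be one-dimensional")
        setattr(obj, self._name, arr)

class Axis:
    # every data class using Axis now goes through the same checks
    data = ValidatedArray()
    ddata = ValidatedArray()
```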
Known issues
The following open issues are tracked upstream at gitlab.com/pyda-group/pyda/-/issues.
Bugs:
- #6 — ydata / ydata raises WrongSizeException. Division between two XYData / YData objects fails due to a unit exponent list length mismatch. Workaround: divide the underlying numpy arrays directly.
- #5 — split_by_time uses indices instead of time values. Start/stop times are multiplied by fs and used as sample indices rather than compared against the actual time axis. Results are incorrect for data that does not start at t = 0.
- #23 — numpy.array * YData calls YData.__mul__ element-wise. When a numpy array is the left operand, Python dispatches multiplication to YData.__mul__ repeatedly rather than treating the array as a single operand. Operator test coverage is incomplete.
Design limitations:
- #11 — No vectorised operations on lists of objects. There is no array-of-objects type. Calling .plot() on a Python list of TSData objects requires my_list[0].plot(*my_list[1:]) as a workaround.
- Processing history (partial implementation) — every ltpda object carries a .history attribute that records Python-side operations (constructor, arithmetic, DSP, repo retrieve/submit). When retrieving a MATLAB-submitted object, the full LTPDA history chain is parsed from <historyRoot> XML and accessible as obj.history. Known limitations:
  - hist2py() (code reconstruction from history) is not implemented.
  - When MATLAB retrieves an ltpda-processed object, Python steps appear in MATLAB's history browser but hist2m() cannot reconstruct them — it produces comments for Python nodes.
  - History is not stored in the ltpda HDF5 format; it lives only in the XML serialisation (objs.xml). Objects retrieved via the HDF5 path (ltpda-submitted) carry only their Python-side history, not any MATLAB chain.
from ltpda.history import display as show_history

# Python-tracked history
ts = TSData.randn(nsecs=100, fs=10)
ts2 = ts * 2.0
show_history(ts2.history)
# [py] mul                 2024-01-15 00:01:02
# [py] TSData.constructor  2024-01-15 00:01:00

# MATLAB history (after repo.retrieve on a MATLAB-submitted ao)
lpsd_obj = repo.retrieve(42)
show_history(lpsd_obj.history)
# [py] repo.retrieve  2024-01-15 00:02:00
# [ml] lpsd           2024-01-15 00:00:58
# [ml] plus           2024-01-14 23:59:50
Enhancements under discussion:
- #9 — Replace ddata with the uncertainties library. Proposal to use uncertainties.uarray instead of separate data/error arrays for more transparent error propagation.
- #8 — Object __str__ should show data values. print(ts) currently shows shape only. Request to show first/last values following the numpy convention.
- #7 — Mixed-unit plots should warn. Plotting objects with incompatible units silently produces a misleading axis label. Request to display [Mixed] or raise a warning.
- #3 — Package name is taken on PyPI (resolved). The name was already registered on PyPI by an unrelated project. Resolved by renaming this package to ltpda.
Version history
0.2.3
iplot()— intelligent plot method mimicking MATLAB'sao.iplot:- Smart data-type dispatch:
TSData→ linear axes;FSData→ log-log with automatic magnitude/phase subplots for complex data. Arrangement='stacked'(default) overlays all objects on the same axes.Arrangement='subplots'stacks each object in its own subplot row (single figure).Arrangement='single'opens one figure per object.XScales/YScales— per-axis scale override ('log'or'lin'); a single string applies to all axes.XRanges/YRanges— per-axis[min, max]limits.LineColors,LineStyles,LineWidths,Markers,MarkerSizes— per-object style control; shorter lists cycle;['all', value]applies one value to every trace.MarkerFaceColor,MarkerEdgeColor— independent marker fill and border colours; same['all', colour]shorthand supported.Legends='off'suppresses legends;Legends=['a', 'b']overrides labels;LegendLocationaccepts MATLAB location strings ('NorthEast','Best', …);LegendFontSizecontrols font size;ShowDescriptions=Trueappends the object's.descriptionattribute to the legend label.Titles— per-subplot title strings (one per object in subplots/single arrangements).XLabels/YLabels— override axis label names; data units are still appended.FigureNames— set the figure suptitle / window title.complexPlotType— controls complex-data display:'absdeg'(magnitude + phase in °, default),'absrad'(magnitude + phase in rad),'realimag'(real + imaginary parts).ShowErrors=Truerenders error bars fromddata;ErrorBarType='bar'(default) or'area'(shaded band). Explicit per-object bounds viaYerrL,YerrU,XerrL,XerrU.AUTOERRORS=Falsedisables automaticddatadetection.- All keyword names match MATLAB's
iplotexactly for zero relearning cost.
- Smart data-type dispatch:
plotinfo— per-object style metadata thatiplot()reads automatically.set_plotinfo(color, linestyle, linewidth, marker, markersize, markerfacecolor, markeredgecolor, fillmarkers, include_in_legend, show_errors)attaches aPlotInfoto any ltpda object. Priority chain:iplot()kwarg > plotinfo field > object loose attribute > matplotlib default. Full MATLAB XML round-trip: Python reads MATLAB<Style>XML on retrieve (all color, linestyle, marker fields parsed into matplotlib equivalents); Python emits exact MATLAB-compatible<Style>on submit (JavaColor.getRGB()decimal encoding).- Richer Python AO processing history — Python history nodes are now as informative
as MATLAB's and produce distinct per-operation groups in the MATLAB history browser:
- Each operation type gets its own blue cluster label instead of the generic
Python/ltpdabucket:ao.ao (Python)for constructors,ao.psd (Python)for spectral estimates,ao.plus (Python)for arithmetic, etc. - Constructor params are fully recorded:
FS,NSECS,YUNITS,WAVEFORM(forrandn/sinewave),A0,F0,PHI(forsinewave),DISTRIBUTION/SIGMA(forrandn). - DSP functions (
psd,logpsd,mscohere,cohere,cpsd,tfe) now record a history node that chains back to the input time-series, capturingWINDOW,NAVS,PERCENT_OVERLAP,NFFT,SCALE,DETREND_ORDER(andPSLL,OLAP,BMIN,LMIN,JDES,KDESforlogpsd). Previously these functions produced no history at all. NoiseGen.generateNoise()recordsNSECS,FS,MODEL,YUNITS.__pow__recordsEXPONENT.
- Each operation type gets its own blue cluster label instead of the generic
- `set_description(text)` — explicit setter on all ltpda objects (mirrors MATLAB's `setDescription`). The `description` property remains directly assignable; this method adds a consistent `set_*` style for use alongside `set_yaxis_name`, `set_plotinfo`, etc.
- Bug fixes:
  - History `context` attribute was silently dropped when Python read a MATLAB-serialized history node from XML and re-submitted it. MATLAB's history browser uses `context` to render "blue tag" cluster labels; losing it caused all pre-existing history steps to appear untagged after a Python round-trip. Fixed by adding a `_context` field to `HistoryNode` and preserving the attribute through the full read → write cycle.
  - `proctime` on history nodes drifted by the system UTC offset on every Python round-trip. `_parse_history_root` was creating naive datetimes via `datetime.utcfromtimestamp()`, which `datetime.timestamp()` (in the serialiser) then treated as local time. Switched to UTC-aware datetimes (`datetime.fromtimestamp(..., tz=timezone.utc)`) throughout.
  - AO `UUID` was not preserved on retrieve: `_parse_ao` discarded the `UUID` attribute from the `<ao>` element, so every re-submit generated a fresh random UUID. Now stamped onto `obj.id` after parsing.
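The `proctime` drift is a textbook naive-vs-aware datetime pitfall. A minimal standalone sketch (plain stdlib, not ltpda code) of why the naive round-trip drifts and how the UTC-aware form avoids it:

```python
from datetime import datetime, timezone

ts = 1700000000.0  # an example epoch timestamp stored in a history node

# Buggy pattern: utcfromtimestamp() (deprecated since Python 3.12 for
# exactly this ambiguity) yields a *naive* datetime; .timestamp() then
# reinterprets that naive value as local time on serialisation, so the
# round-trip drifts by the system's UTC offset.
naive = datetime.utcfromtimestamp(ts)
drift = naive.timestamp() - ts  # non-zero unless local time == UTC

# Fixed pattern: a UTC-aware datetime round-trips exactly.
aware = datetime.fromtimestamp(ts, tz=timezone.utc)
assert aware.timestamp() == ts
```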
0.2.2
- First pypi.org release
0.2.1
- Renamed package from `pyda` to `ltpda` to resolve a PyPI naming conflict (issue #3). File extension `.pyda` → `.ltpda` (`.pyda` files still load for backward compatibility). Repository sentinel `binary_pyda` → `binary_hdf5`.
- Dependency updates for NumPy 2.x compatibility: `numpy` uncapped (≥ 1.18), `matplotlib ≥ 3.9`, `h5py ≥ 3.10`. Added `mpmath ≥ 1.0` as a runtime dependency.
- Wired up `ltpda.dsp.NoiseGen` (Franklin noise generator): added the missing `mpmath` dependency, exported it from `ltpda.dsp`, and added smoke tests.
- Bug fixes:
  - The `PZ()` no-argument constructor crashed with `TypeError` because `numpy.isreal(None)` is `True`, causing `fq2ri(f0=None)` to be called. Guarded the dispatch block with `if f is not None`.
  - `TSData.nsecs()` and `TSData.fs()` raised `ValueError` / emitted numpy warnings on empty time-series objects. Both now return `0.0` early when `xdata()` is empty.
  - The `Axis.ddata` setter size check was gated on `numpy.shape(ddata)[0] > 2` (first dimension, not total size), allowing mismatched error vectors to be silently accepted. Replaced with `ddata.size > 1`.
- Test suite: removed three stale `@unittest.skip` decorators (bugs resolved). Excluded `ltpda/repo/*` from coverage measurement (requires a live MySQL instance). Coverage threshold met at 56%.
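The empty-time-series fix follows a simple early-return guard pattern. A hypothetical stand-alone sketch (not the actual `TSData` implementation), assuming the sample rate is estimated from the median sample spacing:

```python
def fs_from_xdata(x):
    """Sample rate estimated from a time vector.

    Hypothetical stand-in for the TSData.fs() guard described above:
    without the early return, differencing an empty sequence would
    raise (or, with numpy, emit a warning and return nan).
    """
    if len(x) < 2:  # early-return guard for empty/singleton data
        return 0.0
    dts = sorted(b - a for a, b in zip(x, x[1:]))
    return 1.0 / dts[len(dts) // 2]  # median sample spacing

print(fs_from_xdata([]))               # 0.0 instead of an exception
print(fs_from_xdata([0.0, 0.5, 1.0]))  # 2.0
```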
0.2.0
- Repository connectivity: MySQL backend, submit/retrieve AO objects, search interface.
- History tracking: record and replay analysis steps; XML exchange with MATLAB LTPDA.
Upstream baseline (pyda, pre-fork)
The following was already present in pyda-group/pyda before this fork was created, written by Martin Hewitson, Artem Basalaev, Christian Darsow-Fromm, and Oliver Gerberding:
- `YData`, `XYData`, `TSData`, `FSData` — core data classes with error propagation
- `Unit` — physical unit algebra (parse, simplify, convert to SI)
- `PZModel` / `PZ` — pole/zero model representation and response computation
- `SpecWin` — spectral window functions (Hann, flat-top, Kaiser-Bessel, …)
- `dsp.spectral` — PSD / ASD estimation via `lpsd`
- `dsp.filter` — digital filter representation
- `dsp.noisegen` — Franklin colored-noise generator (wired up in 0.2.1)
- HDF5 save/load for all data classes
- Operator overloading (`+`, `-`, `*`, `/`, `**`, comparison) with unit checking
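As a rough illustration of unit-checked operator overloading, here is a toy class (not the actual ltpda `Unit`/AO implementation, whose algebra also simplifies units and propagates errors) showing the dispatch pattern:

```python
class Quantity:
    """Toy value-with-unit: unit-checked '+' and unit-combining '*'."""

    def __init__(self, value, unit):
        self.value, self.unit = value, unit

    def __add__(self, other):
        # Addition requires identical units, as in ltpda's unit checking.
        if self.unit != other.unit:
            raise ValueError(f"incompatible units: {self.unit} vs {other.unit}")
        return Quantity(self.value + other.value, self.unit)

    def __mul__(self, other):
        # Multiplication combines units (the real Unit algebra simplifies them).
        return Quantity(self.value * other.value, f"{self.unit} {other.unit}")

    def __repr__(self):
        return f"{self.value} [{self.unit}]"

print(Quantity(3.0, "m") + Quantity(4.0, "m"))  # 7.0 [m]
print(Quantity(2.0, "m") * Quantity(5.0, "s"))  # 10.0 [m s]
# Quantity(1.0, "m") + Quantity(1.0, "s")  -> raises ValueError
```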
Heritage
ltpda was created by Martin Hewitson, Artem Basalaev, Christian Darsow-Fromm, and Oliver Gerberding as a Python reimplementation of the LTPDA MATLAB toolbox for gravitational-wave and precision-measurement data analysis. The upstream project is maintained at gitlab.com/pyda-group/pyda.
This fork extends the upstream work for integration with the LTPDA repository stack.
Original authors:
- Martin Hewitson — martin.hewitson@aei.mpg.de
- Artem Basalaev — artem.basalaev@physik.uni-hamburg.de
- Christian Darsow-Fromm — cdarsowf@physnet.uni-hamburg.de
- Oliver Gerberding — oliver.gerberding@physik.uni-hamburg.de
Disclaimer
This software is provided "as is", without warranty of any kind, express or implied. Use at your own risk. The authors make no guarantees about correctness, fitness for a particular purpose, or continued development. See LICENSE.md for full terms.
License
Upstream pyda copyright 2022 Martin Hewitson, Artem Basalaev, Christian Darsow-Fromm, and Oliver Gerberding. See Heritage.
Modifications and extensions in this fork: Copyright 2026 Simon Barke.
Licensed under the Apache License, Version 2.0. See LICENSE.md.
File details
Details for the file ltpda-0.2.3.tar.gz.
File metadata
- Download URL: ltpda-0.2.3.tar.gz
- Upload date:
- Size: 6.0 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.3.4 CPython/3.10.11 Windows/10
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `6348495ca09860ed478a8bdc1e4c66fdec7d9bc81b730b95dff55ca226c9c6d6` |
| MD5 | `f8bc26d4e5269dc613a55b971641bb0c` |
| BLAKE2b-256 | `e0dbbb925c770d97d805a6291a1d905d7f468ffb46d67799d450663199476763` |
File details
Details for the file ltpda-0.2.3-py3-none-any.whl.
File metadata
- Download URL: ltpda-0.2.3-py3-none-any.whl
- Upload date:
- Size: 6.1 MB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.3.4 CPython/3.10.11 Windows/10
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `ace17b0a9a38fe67297a72781e5d2053bc563b40b987506f3a9aa707ef727022` |
| MD5 | `cd89726478f524783a5c46b7c5446f1c` |
| BLAKE2b-256 | `644b4b8d0ae7f49432323003a942870209105d5506a7e2510dce77e10dce57d4` |