Python fork of pyda (Hewitson et al.) — LTPDA-style signal processing with repository integration.


ltpda

Python package for LTPDA-style signal processing and LTI system analysis. Fork of pyda-group/pyda, extended for integration with the LTPDA repository stack.


Overview

ltpda provides Python equivalents of the core LTPDA MATLAB toolbox objects: time-series and frequency-series data classes, spectral estimation, pole/zero models, digital filters, and a physical unit algebra. The "it just works" principle of the original MATLAB toolbox is preserved — common analysis tasks require very few lines of code, while the underlying data structures remain fully accessible for advanced use.

The package is in active development. Core signal processing is stable. Features not yet implemented include IIR filter design, plist parameter-list objects, XYZData, and full MATLAB-parity history code reconstruction (hist2py).


Requirements

  • Python 3.10 or later (tested up to 3.14)
  • numpy ≥ 1.18, scipy ≥ 1.5, matplotlib ≥ 3.0, h5py ≥ 3
  • lpsd ≥ 1.0.2 (log-scale PSD estimator — see Installation)

Installation

pip from PyPI

pip install ltpda

Full package listing: https://pypi.org/project/ltpda/

pip from wheel

Download the .whl file from the Releases page, then:

pip install ltpda-<version>-py3-none-any.whl

pip from source

No-clone option (installs directly from the git repository):

pip install git+https://github.com/LordSkippy/LTPDA.git#subdirectory=python

Or clone first:

git clone <this-repo>
cd LTPDA/python
pip install .

All dependencies, including lpsd, are installed automatically.

Developers — Poetry

cd LTPDA/python
poetry install
poetry run pre-commit install   # enable Black, isort, mypy, pylint hooks

lpsd (manual build on Apple Silicon)

lpsd (source: git.physnet.uni-hamburg.de) installs automatically as a listed dependency. The only reason to touch it manually is a performance issue on Apple Silicon (M1/M2): lpsd contains C code that uses long double arithmetic, which on ARM is the same width as double (64-bit). The polyreg step has been observed to dominate runtime. If logpsd is unusably slow, compile lpsd from source with architecture-specific flags:

# from the lpsd source directory
gcc -arch arm64 -c -fPIC ltpda_dft.c
gcc -arch arm64 -shared -o ltpda_dft.so ltpda_dft.o

If contributing performance fixes for Apple Silicon, audit the remaining long double usage throughout the C source.


Quick start

from ltpda.tsdata import TSData
from ltpda.dsp.spectral import psd, asd

# 10000 s of white noise at 10 Hz
ts = TSData.randn(nsecs=10000, fs=10, name='noise', yunits='m')

# Power and amplitude spectral density
Pxx = psd(ts, navs=10, window='BH92')
Sxx = asd(ts, navs=10, window='BH92')
Sxx.loglog()
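The psd call above performs Welch-averaged (WOSA) estimation. For readers who want to see the mechanics outside ltpda, a conceptually equivalent computation with scipy is sketched below (the segmenting choice is illustrative, and scipy's Hann window stands in for ltpda's BH92):

```python
import numpy as np
from scipy.signal import welch

fs = 10.0                          # sample rate in Hz
rng = np.random.default_rng(0)
y = rng.standard_normal(100_000)   # 10000 s of unit-variance white noise

# 10 averages -> segment length = N / 10 (50% overlap raises the count)
nperseg = len(y) // 10
f, Pxx = welch(y, fs=fs, window="hann", nperseg=nperseg)

# For unit-variance white noise the one-sided PSD is ~ 2/fs
print(Pxx[1:].mean())              # ≈ 0.2
```

The sanity check at the end follows from Parseval: a one-sided density of 2/fs integrated over the Nyquist band 0…fs/2 recovers the unit variance.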

Documentation

Guide Contents
Data objects TSData, FSData, XYData, YData — creation, arithmetic, units, splitting
Spectral analysis psd, asd, csd, tfe, logpsd, spectral windows
Fitting linfit, polynomfit, bilinfit, lscov, polyfit, xfit, tdfit, sDomainFit, zDomainFit, ParFrac, PEst
Models & filters PZModel, FIR, NoiseGen, differentiation
Plotting & I/O iplot, plotinfo, save/load, from_txt
Repository MySQL connectivity, credentials, submit, retrieve, search

Examples

Interactive Jupyter notebooks are in Examples/. See Examples/README.md for the full index.


Core classes

Data hierarchy

YData                   Y-axis data with units and Gaussian error propagation
  └── XYData            adds an X axis (general 2-D data)
        ├── TSData      time-series — sampling-rate aware; auto-generates time axis
        └── FSData      frequency-series — X units default to Hz

Supporting classes

Class Purpose
Axis Wraps a numpy array with a Unit, error array (ddata), and a name
Unit Symbolic unit algebra — parse, multiply, simplify, convert to SI
Specwin 30+ spectral window functions
PZ Single pole or zero in f/Q or complex (s-plane) representation
PZModel Poles, zeros, gain, and delay — evaluates to FSData via .resp()
DFilter / FIR Digital filter classes with .resp() and .filter()
NoiseGen Franklin-algorithm colored-noise generator driven by a PZModel
ParFrac Partial-fraction model returned by sDomainFit / zDomainFit
PEst Parameter estimate returned by all fitting functions; supports eval() for linear fitters and xfit string expressions
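The f/Q ↔ complex conversion handled by PZ follows the standard second-order-section convention. The name fq2ri appears in the changelog, but the signatures below are assumed for illustration and may not match ltpda's internal API:

```python
import numpy as np

def fq2ri(f0, q):
    """Convert an (f0, Q) pole/zero pair to its complex s-plane roots.

    Standard second-order convention: s^2 + (w0/Q) s + w0^2 = 0, so
    s = -w0/(2Q) +/- j*w0*sqrt(1 - 1/(4Q^2)) with w0 = 2*pi*f0.
    """
    w0 = 2 * np.pi * f0
    re = -w0 / (2 * q)
    im = w0 * np.sqrt(max(0.0, 1 - 1 / (4 * q**2)))
    return complex(re, im), complex(re, -im)

def ri2fq(s):
    """Recover (f0, Q) from one complex root of a conjugate pair.
    Note |s| = w0 regardless of Q, so f0 = |s| / (2*pi)."""
    w0 = abs(s)
    return w0 / (2 * np.pi), -w0 / (2 * s.real)

s1, s2 = fq2ri(f0=1.0, q=10.0)
print(ri2fq(s1))   # round-trips to (1.0, 10.0)
```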

Features

  • Time and frequency series — TSData and FSData with unit tracking, error propagation, and HDF5 serialisation (.ltpda files, versioned format)
  • Physical unit algebra — parses unit strings ("m/s^2", "pm^1.5", …), multiplies, simplifies, converts to SI, and produces LaTeX axis labels
  • Error propagation — Gaussian errors tracked through every arithmetic operation including +, -, *, /, **, abs, sqrt, log10, exp
  • Spectral estimation — Welch WOSA: psd, asd, csd, mscohere, cohere, tfe; log-scale logpsd via the external lpsd library; PSD / ASD / PS / AS output scaling
  • Spectral windows — 30+ types; each exposes NENBW, PSLL, and 3 dB bandwidth properties
  • Pole/zero models — PZModel with frequency-response evaluation; automatic f/Q ↔ complex root conversion; complex-conjugate pole pairs handled correctly
  • FIR digital filters — lowpass, highpass, bandpass, bandstop; frequency response and time-domain filtering of TSData
  • Noise generation — Franklin algorithm; arbitrary spectral shape prescribed by a PZModel; state maintained across calls for arbitrarily long sequences
  • Fitting — five linear/polynomial methods: linfit (straight-line), polynomfit (arbitrary-power polynomial), bilinfit (multilinear), lscov (direct design-matrix), polyfit (descending-order numpy wrapper); all support weighted least squares; general nonlinear least-squares (xfit); time-domain system ID (tdfit); s-domain and z-domain vector fitting (sDomainFit, zDomainFit → ParFrac); all mirror MATLAB LTPDA parameter names exactly; PEst.eval() reconstructs the fitted curve from any linear fit result or xfit string expression (mirrors MATLAB pest.eval())
  • Differentiation — five methods: 2-point, 3-point, 5-point, order-2 polynomial fit, and order-2 with 5-point smoothing; orders Zero, First, Second
  • Resampling and fractional delay — windowed-sinc interpolation with Blackman window
  • Plotting — plot, loglog, semilogy, semilogx; complex data automatically splits into magnitude and phase panels; error bars with ShowErrors=True, ErrorType='area'
  • File I/O — save() / load() on all data objects; from_txt_file() and from_complex_txt_file() class-method constructors
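The error-propagation feature listed above applies the usual first-order Gaussian rules. As a standalone illustration (plain numpy, independent errors assumed, not ltpda's implementation), the rule for multiplication is:

```python
import numpy as np

def mul_with_errors(a, da, b, db):
    """First-order Gaussian propagation for y = a * b with independent
    errors: (dy/y)^2 = (da/a)^2 + (db/b)^2."""
    y = a * b
    dy = np.abs(y) * np.sqrt((da / a) ** 2 + (db / b) ** 2)
    return y, dy

y, dy = mul_with_errors(np.array([2.0]), np.array([0.1]),
                        np.array([3.0]), np.array([0.3]))
print(y, dy)   # [6.] [~0.671]
```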

Not yet implemented

  • IIR filters (MATLAB miir)
  • plist parameter-list objects (currently plain Python keyword arguments)
  • XYZData class with spectrogram support
  • Additional math operators on XYData: sin, cos, tan and friends
  • Log-scale spectral estimators: ltfe, lcohere, and equivalents of the remaining LTPDA lpsd family
  • fpsder — fractional polynomial derivative (started, not finished)
  • Vectorised spectral functions — psd(*ts_list) / asd(*ts_list) to operate on multiple objects at once
  • Axis-level method helper — a generic wrapper to apply arbitrary functions to an Axis with correct error propagation
  • Time-domain simulation / step response for PZModel
  • Calibration objects and control-system design utilities
  • Docstrings — help text coverage is incomplete throughout the package

Directory layout

python/
├── ltpda/
│   ├── ydata.py          YData base class
│   ├── xydata.py         XYData (general 2-D data)
│   ├── tsdata.py         TSData (time-series, with absolute t0 support)
│   ├── fsdata.py         FSData (frequency-series)
│   ├── pzmodel.py        PZModel + PZ (pole/zero transfer functions)
│   ├── parfrac.py        ParFrac (partial-fraction model — sDomainFit/zDomainFit output)
│   ├── pest.py           PEst (parameter estimate — xfit/tdfit output)
│   ├── functions.py      Module-level function wrappers
│   ├── repo/             Repository connectivity (direct MySQL / PyMySQL)
│   │   ├── __init__.py   Exports LTPDARepository
│   │   ├── client.py     LTPDARepository — main public API class
│   │   ├── models.py     SubmitResult, ObjectMeta, SearchResult dataclasses
│   │   ├── _connection.py  MySQL connection wrapper (RepoConnection)
│   │   ├── _submit.py    Submit logic (mirrors MATLAB submit.m)
│   │   ├── _retrieve.py  Retrieve / time-range / HDF5 deserialization
│   │   └── _search.py    Search, find, metadata, report utilities
│   ├── utils/
│   │   ├── axis.py       Axis — numpy array with units and errors
│   │   ├── unit.py       Unit — symbolic algebra and SI conversion
│   │   ├── specwin.py    Spectral windows (30+ types)
│   │   └── math/         Helper math utilities (rat, intfact, normal_round)
│   ├── dsp/
│   │   ├── filter.py     TF, DFilter, FIR digital filter classes
│   │   ├── spectral.py   PSD, ASD, CSD, coherence, TFE estimators
│   │   ├── noisegen.py   Franklin noise generator
│   │   └── fit.py        linfit, polynomfit, bilinfit, lscov, polyfit, xfit, tdfit, sDomainFit, zDomainFit
│   └── mixins/           Composable mixins (operators, plotting, diff, DSP)
├── Documentation/        Per-topic user guides
│   ├── data-objects.md
│   ├── spectral-analysis.md
│   ├── fitting.md
│   ├── models-filters.md
│   ├── plotting-io.md
│   └── repository.md
├── Examples/             26 Jupyter notebooks covering all major features
├── docker/               Dockerfile for CI / containerised testing
└── tests/                pytest test suite (~54% coverage)

Development

Run the tests

make test
# or
poetry run pytest

All tests must pass and coverage must not drop below 54%.

Docker

A docker/Dockerfile builds a self-contained Python environment with ltpda installed (Python 3.10 by default, also tested against 3.7). The Makefile provides helpers:

make docker         # build gwdiexp/ltpda:develop (and :develop-3.10)
make docker-push    # push both tags to Docker Hub
make test-docker    # run the test suite inside the container

The Docker image is primarily used for CI. To run tests in the container locally:

docker run -v $(pwd):/code --rm -it gwdiexp/ltpda:develop make test

Code style

Black (88-character lines), isort, pylint, and mypy are enforced via pre-commit. The hooks run automatically before each commit once enabled:

poetry run pre-commit install

Release a new version

poetry version patch   # bug fixes
poetry version minor   # new features
poetry version major   # breaking changes

Then merge to main.

Open design questions

These architectural decisions are unresolved and worth settling before the relevant areas grow further:

  • Plotter separation — plotting methods (plot, loglog, …) currently live as mixins on the data classes. An alternative is a standalone TSPlotter / FSPlotter class: tsplt.loglog(ts1, ts2, ts3). This would decouple visualisation from data and make the classes easier to test.

  • Spectral and filter mixins — psd, asd, tfe, and filter application currently live in separate modules. Since they only operate on TSData, mixing them directly onto TSData (like TSDataDSP) would give ts.psd(navs=10) call syntax. Trade-off: convenience vs. separation of concerns.

  • Setter validation in Axis — input checking for data, ddata, and units is spread across the data classes. Moving it into Axis.__set__ would centralise validation and make subclassing safer.
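The standalone-plotter option can be prototyped without modifying the data classes. A hypothetical sketch of the TSPlotter idea (only the tsplt.loglog(...) call shape comes from the bullet above; the rest is assumed):

```python
import matplotlib
matplotlib.use("Agg")           # headless backend for scripting/CI
import matplotlib.pyplot as plt

class TSPlotter:
    """Standalone plotter: draws any objects exposing .xdata()/.ydata().

    Decoupling plotting from the data classes means the data classes can
    be tested without matplotlib, and new plot styles need no mixin edits.
    """
    def loglog(self, *objs, **kwargs):
        fig, ax = plt.subplots()
        for obj in objs:
            ax.loglog(obj.xdata(), obj.ydata(),
                      label=getattr(obj, "name", None), **kwargs)
        ax.legend()
        return fig, ax

class _Series:                  # minimal stand-in for a TSData-like object
    name = "demo"
    def xdata(self): return [1, 10, 100]
    def ydata(self): return [1, 2, 3]

fig, ax = TSPlotter().loglog(_Series())
print(len(ax.lines))            # one trace drawn
```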


Known issues

The following open issues are tracked upstream at gitlab.com/pyda-group/pyda/-/issues.

Bugs:

  • #6 — ydata / ydata raises WrongSizeException. Division between two XYData / YData objects fails due to a unit exponent list length mismatch. Workaround: divide the underlying numpy arrays directly.

  • #5 — split_by_time uses indices instead of time values. Start/stop times are multiplied by fs and used as sample indices rather than compared against the actual time axis. Results are incorrect for data that does not start at t = 0.

  • #23 — numpy.array * YData calls YData.__mul__ element-wise. When a numpy array is the left operand, Python dispatches multiplication to YData.__mul__ repeatedly rather than treating the array as a single operand. Operator test coverage is incomplete.
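Issue #23 is the classic numpy left-operand dispatch problem. The usual remedy, shown here on a toy class rather than ltpda's actual YData, is to opt out of numpy's ufunc machinery so that Python falls back to the reflected operator:

```python
import numpy as np

class Quantity:
    """Toy YData-like wrapper demonstrating the dispatch fix."""
    # Telling numpy "hands off": with __array_ufunc__ = None,
    # ndarray.__mul__ returns NotImplemented for this operand, so Python
    # calls Quantity.__rmul__ once with the whole array.
    __array_ufunc__ = None

    def __init__(self, values):
        self.values = np.asarray(values)

    def __rmul__(self, other):
        return Quantity(np.asarray(other) * self.values)

q = np.array([1.0, 2.0]) * Quantity([10.0, 10.0])
print(type(q).__name__, q.values)   # Quantity [10. 20.]
```

Without the `__array_ufunc__ = None` opt-out, numpy broadcasts the object across the array and calls the operator once per element.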

Design limitations:

  • #11 — No vectorised operations on lists of objects. There is no array-of-objects type. Calling .plot() on a Python list of TSData objects requires my_list[0].plot(*my_list[1:]) as a workaround.

  • Processing history (partial implementation) — Every ltpda object carries a .history attribute that records Python-side operations (constructor, arithmetic, DSP, repo retrieve/submit). When retrieving a MATLAB-submitted object, the full LTPDA history chain is parsed from <historyRoot> XML and accessible as obj.history. Known limitations:

    • hist2py() (code reconstruction from history) is not implemented.
    • When MATLAB retrieves a ltpda-processed object, Python steps appear in MATLAB's history browser but hist2m() cannot reconstruct them — it produces comments for Python nodes.
    • History is carried in both HDF5 and XML since 0.2.4. Mixed MATLAB/Python chains are fully preserved through HDF5 round-trips.
    from ltpda.history import display as show_history
    
    # Python-tracked history
    ts = TSData.randn(nsecs=100, fs=10)
    ts2 = ts * 2.0
    show_history(ts2.history)
    # [py] mul  2024-01-15 00:01:02
    #   [py] TSData.constructor  2024-01-15 00:01:00
    
    # MATLAB history (after repo.retrieve on a MATLAB-submitted ao)
    lpsd_obj = repo.retrieve(42)
    show_history(lpsd_obj.history)
    # [py] repo.retrieve  2024-01-15 00:02:00
    #   [ml] lpsd  2024-01-15 00:00:58
    #     [ml] plus  2024-01-14 23:59:50
    

Enhancements under discussion:

  • #9 — Replace ddata with the uncertainties library. Proposal to use uncertainties.uarray instead of separate data/error arrays for more transparent error propagation.

  • #8 — Object __str__ should show data values. print(ts) currently shows shape only. Request to show first/last values following the numpy convention.

  • #7 — Mixed-unit plots should warn. Plotting objects with incompatible units silently produces a misleading axis label. Request to display [Mixed] or raise a warning.

  • #3 — Package name is taken on PyPI (resolved — package renamed to ltpda). The name pyda was already registered on PyPI by an unrelated project. Resolved by renaming this package to ltpda.


Version history

0.2.7

  • PEst.eval() for xfit string expressions — mirrors MATLAB exactly. In MATLAB, string Function arguments are converted to an smodel and stored in pest.models, enabling pest.eval(). Python now does the same: string expressions are stored in _model_info and result.eval(data) works identically to the linear fitting methods. Callable Function arguments remain non-serialisable; reconstruct manually with my_fn(data.xdata(), result.y).
  • Examples/ltpda_fitting.ipynb updated: xfit-tsdata-plot now uses result_str.eval(ts) instead of manual reconstruction; sections 6a and 6c switched from lambdas to string expressions and use eval(); the introduction callout now shows a MATLAB-parity table distinguishing string vs callable behaviour.
  • Documentation/fitting.md — xfit string expression section updated to document eval() support; algorithm default corrected to 'nelder-mead'; PEst.eval() section updated to list all supported model types.
  • PEst.eval(xdata) — reconstructs the fitted curve from a PEst parameter estimate as a new data object, mirroring MATLAB's pest.eval(). Supported for all five linear fitting methods (linfit, polynomfit, bilinfit, lscov, polyfit); eval() for xfit string expressions was added.
    • Single-input models (linfit, polynomfit, polyfit): result.eval(ts)TSData, result.eval(fs)FSData, result.eval(x_arr)numpy.ndarray.
    • Multi-input models (bilinfit, lscov): result.eval([x1, x2]) — list of inputs, return type mirrors the primary (first) input.
    • polyfit(rescale=True) normalisation is applied automatically at eval time — no manual (x − mean) / std needed.
    • Output carries a history node (method='eval') chaining back to the fit node, matching MATLAB's history chain exactly.
    • PEst.__str__() now includes a model: line showing the stored expression (e.g. P1·X^0 + P2·X^1).
  • All five linear fitting methods and eval() work on any XYData subclass (TSData, FSData, …) — the method is defined on the XYData base class.
  • Examples/ltpda_linfit_testing.ipynb updated: all manual polynomial evaluations replaced with result.eval(); a new FSData (XYData-compatibility) cell added at the end of each of the five sections.
  • linfit — straight-line weighted least-squares fit (Y = P1 + P2·X); mirrors MATLAB @ao/linfit. Available as ts.linfit() or linfit(ts) / linfit(x_arr, y_arr).
  • polynomfit — arbitrary-power polynomial fit (e.g. orders=(-2, 0, 1) fits a/x² + b + c·x); supports per-point X and Y uncertainties with automatic error propagation; mirrors MATLAB @ao/polynomfit.
  • bilinfit — multilinear fit Y = X1·P1 + X2·P2 + … + P(N+1) with implicit constant term and optional weighted least squares; mirrors MATLAB @ao/bilinfit.
  • lscov — direct design-matrix fit (no constant) with precision-weight or full covariance-matrix weighting; column uncertainty propagation when input columns carry .dy; mirrors MATLAB @ao/lscov.
  • polyfit — descending-order polynomial fit via numpy.polyfit; optional X rescaling for numerical stability when X is far from zero; mirrors MATLAB @ao/polyfit.
  • All five methods available as methods on TSData / FSData and as standalone functions in ltpda.dsp.fit; all record processing history with UPPERCASE parameter keys.
  • Documentation/fitting.md restructured: linear methods first, nonlinear / vector fitting last; new section explaining both model-function syntaxes (callable lambda x, P: … and MATLAB-derived string 'P[0] + P[1]*Xdata') with side-by-side examples.
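The rescale=True normalisation described for polyfit above can be reproduced with plain numpy. A sketch of the standardise-fit-evaluate pattern (assumed to match the intent of the feature, not copied from ltpda):

```python
import numpy as np

# X far from zero: raw polynomial fits become ill-conditioned
x = np.linspace(1e6, 1e6 + 100, 50)
y = 3.0 * (x - x.mean()) + 7.0          # a line, expressed around the mean

# Fit on the standardised variable z = (x - mean) / std ...
mu, sigma = x.mean(), x.std()
p = np.polyfit((x - mu) / sigma, y, deg=1)   # descending order: [P2, P1]

# ... and apply the same normalisation again at evaluation time
y_hat = np.polyval(p, (x - mu) / sigma)
print(np.max(np.abs(y_hat - y)))        # residual near machine precision
```

Applying the normalisation automatically at eval time, as the changelog describes, spares the user from redoing the (x − mean) / std step by hand.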

0.2.6

  • TSData.from_function() — construct a time-series by evaluating a Python expression of t (1-D numpy array). Mirrors the MATLAB ao(plist('tsfcn', '<expr>', 'fs', fs, 'nsecs', T)) constructor. Formula string has t, numpy, and np in scope, e.g. fcn='0.01 * numpy.random.randn(len(t))'.
  • ltpda_fitting.ipynb extended with Section 6 — five additional xfit usage patterns ported from MATLAB's test_ao_xfit.m, including fsfcn, tsfcn, xyfcn, smodel (lambda equivalent), and multi-channel sequential fits. Also demonstrates scipy.optimize.differential_evolution as the Python equivalent of MATLAB's MonteCarlo/Npoints global search.
  • sDomainFit / zDomainFit — s-domain and z-domain vector fitting of frequency-domain data using the relaxed vector fitting algorithm (Gustavsen 1999/2006). Returns a ParFrac partial-fraction model. ParFrac.resp(freqs) evaluates the model; ParFrac.to_ba() converts a z-domain model to rational filter coefficients.
  • xfit — general nonlinear least-squares curve fitting for FSData and TSData. Accepts a callable f(x, P) or a MATLAB-style eval string with Xdata / P variables. Returns a PEst with best-fit parameters, 1-sigma uncertainties, and covariance matrix.
  • tdfit — time-domain system identification: estimates a transfer function from input/output TSData via tfe(), then fits a parametric model using xfit. Returns PEst.
  • ParFrac and PEst — new output classes (mirror MATLAB's parfrac and pest), exported from ltpda and carrying full processing history.
  • All four methods are available both as methods on data objects and as standalone functions in ltpda.dsp.fit. Parameter names match MATLAB exactly.
  • pytest updated to ^8.0 for Python 3.12+ compatibility (ast.Str removal).
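TSData.from_function (below) evaluates a formula string of t. The core mechanism can be sketched with a plain eval over a small namespace (a simplification; the real constructor may sandbox the expression differently):

```python
import numpy as np

def eval_tsfcn(fcn, fs, nsecs):
    """Evaluate a formula string of t, mirroring the tsfcn-style
    constructor: t, numpy, and np are in scope per the changelog."""
    t = np.arange(int(fs * nsecs)) / fs
    y = eval(fcn, {"t": t, "numpy": np, "np": np})
    return t, np.asarray(y, dtype=float)

t, y = eval_tsfcn("np.sin(2 * np.pi * 0.5 * t)", fs=10, nsecs=2)
print(len(t), y[0])     # 20 samples, starting at t = 0
```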

0.2.5

Full XML / HDF5 parity with MATLAB's .mat and objs.xml formats. After this release Python-generated XML and HDF5 files carry the same information as MATLAB-generated ones in all fields except toffset (always 0; MATLAB bakes t0 + toffset into t0 before writing XML).

New fields on data objects:

  • FSData.t0 — MATLAB's fsdata carries an optional UTC start time (set by ao/lpsd). FSData now has a t0 attribute (datetime or None), stored in HDF5 as an ISO string and in XML as <t0><time utc_epoch_milli="..."/></t0>, matching TSData's existing behaviour.
  • procinfo pass-through — MATLAB's <procinfo> plist (e.g. the lpsd frequency plan r, m, L, K) is now captured as raw XML in _procinfo_raw on retrieve, re-emitted verbatim on submit, and stored in HDF5 so it survives an XML→HDF5→XML round-trip.
  • timespan — MATLAB auto-sets timespan = (t0 + x(1), t0 + x(end) + 1/fs) on every TSData. Python replicates this in the TSData constructor when t0 is known and stores it as obj.timespan = (startT, endT). Survives HDF5 and XML round-trips.
  • FSData.fs — stores the original time-series sample rate that produced the spectrum (e.g. 4096 Hz for a 0–2048 Hz PSD), matching MATLAB's fsdata.fs. All spectral functions (psd, cpsd, tfe, mscohere, cohere, logpsd) set Sxx.fs automatically. Defaults to 0.0 for FSData objects created without a time-series input (filter responses, from_txt_file, etc.). Previously Python wrote x_data[-1] (max frequency) to the XML <fs> element; now the true sample rate is written.
  • navs and enbw on FSData — spectral estimation now computes and stores these on the returned FSData, matching MATLAB's fsdata:
    • psd, cpsd, tfe, mscohere, cohere: navs = actual Welch segments; enbw = 1-element array = fs · S2 / S1².
    • logpsd: navs = desired averages (Kdes); enbw = per-bin array read directly from the lpsd library output, matching MATLAB's ao/lpsd per-bin vector.
    • FSData.enbw is always a numpy.ndarray (1-element for WOSA, N-element for logpsd).
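The WOSA enbw formula quoted above, fs · S2 / S1², depends only on the window samples. A quick numpy check, independent of ltpda:

```python
import numpy as np

def enbw(window, fs):
    """Equivalent noise bandwidth of a spectral window:
    fs * S2 / S1**2, with S1 = sum(w) and S2 = sum(w**2)."""
    w = np.asarray(window, dtype=float)
    return fs * np.sum(w**2) / np.sum(w) ** 2

fs, n = 10.0, 1000
print(enbw(np.ones(n), fs))        # rectangular: fs / N = 0.01 Hz
print(enbw(np.hanning(n), fs))     # Hann: ~1.5 * fs / N = 0.015 Hz
```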

HDF5 fields added to TSData group (all recomputable, stored for direct inspection): fs, nsecs, toffset (= 0), timespan_start / timespan_end, procinfo_raw.

HDF5 fields added to FSData group: fs (original time-series sample rate).

Breaking changes:

  • TSData.fs is now a property (access as ts.fs, not ts.fs()). This makes it consistent with FSData.fs and with MATLAB's convention where both are properties. Any external code calling ts.fs() must be updated to ts.fs.
  • TSData.nsecs is now a property (access as ts.nsecs, not ts.nsecs()). Any external code calling ts.nsecs() must be updated to ts.nsecs.

New methods:

  • FSData.rms() — fixed and redesigned. Now mirrors MATLAB's @ao/rms: returns a cumulative RMS curve as an FSData with the same frequency axis. The spectral type (ASD or PSD) is inferred automatically from y-axis units (Hz exponent −0.5 → ASD, −1.0 → PSD). Output units are the base physical unit X: for ASD (X/√Hz) the Hz factor is stripped; for PSD (X²/Hz) the Hz factor is stripped and remaining exponents are halved (√(X²) = X). Raises ValueError for non-spectral-density units.
  • FSData.rms_scalar() — new method. Integrates the full spectrum and returns a single RMS value as a YData scalar. Uses the same unit auto-detection and unit correction as rms().
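The cumulative-RMS behaviour of rms() follows from Parseval's theorem: integrating the one-sided PSD up to f gives the signal variance below f. A standalone numpy sketch (not ltpda's code):

```python
import numpy as np

def cumulative_rms(freqs, psd):
    """Cumulative RMS curve from a one-sided PSD:
    rms(f) = sqrt( integral from 0 to f of S(f') df' ),
    using the trapezoid rule on the sampled spectrum."""
    var = np.concatenate(
        ([0.0], np.cumsum(np.diff(freqs) * 0.5 * (psd[1:] + psd[:-1])))
    )
    return np.sqrt(var)

# Flat PSD of 0.2 m^2/Hz over 0..5 Hz -> total RMS = sqrt(0.2 * 5) = 1 m
f = np.linspace(0, 5, 501)
rms = cumulative_rms(f, np.full_like(f, 0.2))
print(rms[-1])    # 1.0
```

Note the unit bookkeeping the text describes: a PSD in X²/Hz integrates to X², so the curve carries the base unit X.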

Bug fixes:

  • FSData.rms() was completely broken: referenced an undefined variable s_sc_phi and used wrong YData constructor parameters.
  • Error messages in YData arithmetic operators displayed <bound method> instead of a length count (missing () on t1.size); fixed.
  • psd(..., scale='ASD') and psd(..., scale='AS') crashed with a ValueError broadcast error whenever only one Welch segment was computed (e.g. nfft ≥ signal length). _psdPeriodogram returns an empty (0, 0) error array for single-segment estimates; _welchscale now treats empty arrays the same as None and skips the chain-rule propagation step, matching the existing single-segment handling in _wosa.

XML fixes:

  • _parse_time_element now returns UTC-aware datetimes (timezone.utc).
  • Timespan XML schema corrected: Python now emits the double-wrapped structure <timespan><timespan shape="1x1">…</timespan></timespan> that MATLAB's getObject dispatcher requires. The single-wrapped form that was previously generated would have caused MATLAB to call feval('startT', …) and crash on retrieve.
  • Timespan XML parse corrected: parser now descends into the inner <timespan> child before looking for startT/endT; previously silently dropped MATLAB-generated timespans.
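The double-wrapped timespan layout described above can be emitted and re-parsed with the standard library. A sketch of the element structure (the startT/endT values are illustrative):

```python
import xml.etree.ElementTree as ET

# MATLAB's getObject dispatcher expects the double-wrapped form:
# <timespan><timespan shape="1x1">...</timespan></timespan>
outer = ET.Element("timespan")
inner = ET.SubElement(outer, "timespan", shape="1x1")
ET.SubElement(inner, "startT").text = "0"
ET.SubElement(inner, "endT").text = "1000"

# A parser must descend into the inner element before reading startT/endT
parsed = ET.fromstring(ET.tostring(outer))
inner_el = parsed.find("timespan")
print(inner_el.get("shape"), inner_el.find("startT").text)
```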

HDF5 parity (introduced for all data classes: YData, XYData, TSData, FSData):

  • Processing history DAG (all nodes, all languages, MATLAB round-trip fields).
  • PlotInfo (all 10 style fields).
  • Compound units (strs, exps, vals subgroup; legacy string attr kept for back-compat).
  • Description (was written but never read back; fixed).
  • enbw stored as HDF5 dataset (not attribute) to support per-bin vectors.

Bug fixes:

  • TSData/FSData._from_hd5f_structure silently discarded axis units and names on load (the XYData constructor resets them; fix saves and restores after construction).

0.2.4

Broken release.

0.2.3

  • iplot() — intelligent plot method mimicking MATLAB's ao.iplot:
    • Smart data-type dispatch: TSData → linear axes; FSData → log-log with automatic magnitude/phase subplots for complex data.
    • Arrangement='stacked' (default) overlays all objects on the same axes.
    • Arrangement='subplots' stacks each object in its own subplot row (single figure).
    • Arrangement='single' opens one figure per object.
    • XScales / YScales — per-axis scale override ('log' or 'lin'); a single string applies to all axes.
    • XRanges / YRanges — per-axis [min, max] limits.
    • LineColors, LineStyles, LineWidths, Markers, MarkerSizes — per-object style control; shorter lists cycle; ['all', value] applies one value to every trace.
    • MarkerFaceColor, MarkerEdgeColor — independent marker fill and border colours; same ['all', colour] shorthand supported.
    • Legends='off' suppresses legends; Legends=['a', 'b'] overrides labels; LegendLocation accepts MATLAB location strings ('NorthEast', 'Best', …); LegendFontSize controls font size; ShowDescriptions=True appends the object's .description attribute to the legend label.
    • Titles — per-subplot title strings (one per object in subplots/single arrangements).
    • XLabels / YLabels — override axis label names; data units are still appended.
    • FigureNames — set the figure suptitle / window title.
    • complexPlotType — controls complex-data display: 'absdeg' (magnitude + phase in °, default), 'absrad' (magnitude + phase in rad), 'realimag' (real + imaginary parts).
    • ShowErrors=True renders error bars from ddata; ErrorBarType='bar' (default) or 'area' (shaded band). Explicit per-object bounds via YerrL, YerrU, XerrL, XerrU. AUTOERRORS=False disables automatic ddata detection.
    • All keyword names match MATLAB's iplot exactly for zero relearning cost.
  • plotinfo — per-object style metadata that iplot() reads automatically. set_plotinfo(color, linestyle, linewidth, marker, markersize, markerfacecolor, markeredgecolor, fillmarkers, include_in_legend, show_errors) attaches a PlotInfo to any ltpda object. Priority chain: iplot() kwarg > plotinfo field > object loose attribute > matplotlib default. Full MATLAB XML round-trip: Python reads MATLAB <Style> XML on retrieve (all color, linestyle, marker fields parsed into matplotlib equivalents); Python emits exact MATLAB-compatible <Style> on submit (Java Color.getRGB() decimal encoding).
  • Richer Python AO processing history — Python history nodes are now as informative as MATLAB's and produce distinct per-operation groups in the MATLAB history browser:
    • Each operation type gets its own blue cluster label instead of the generic Python/ltpda bucket: ao.ao (Python) for constructors, ao.psd (Python) for spectral estimates, ao.plus (Python) for arithmetic, etc.
    • Constructor params are fully recorded: FS, NSECS, YUNITS, WAVEFORM (for randn / sinewave), A0, F0, PHI (for sinewave), DISTRIBUTION / SIGMA (for randn).
    • DSP functions (psd, logpsd, mscohere, cohere, cpsd, tfe) now record a history node that chains back to the input time-series, capturing WINDOW, NAVS, PERCENT_OVERLAP, NFFT, SCALE, DETREND_ORDER (and PSLL, OLAP, BMIN, LMIN, JDES, KDES for logpsd). Previously these functions produced no history at all.
    • NoiseGen.generateNoise() records NSECS, FS, MODEL, YUNITS.
    • __pow__ records EXPONENT.
  • set_description(text) — explicit setter on all ltpda objects (mirrors MATLAB's setDescription). The description property remains directly assignable; this method adds a consistent set_* style for use alongside set_yaxis_name, set_plotinfo, etc.
  • Bug fixes:
    • History context attribute was silently dropped when Python read a MATLAB-serialized history node from XML and re-submitted it. MATLAB's history browser uses context to render "blue tag" cluster labels; losing it caused all pre-existing history steps to appear untagged after a Python round-trip. Fixed by adding a _context field to HistoryNode and preserving the attribute through the full read → write cycle.
    • proctime on history nodes drifted by the system UTC offset on every Python round-trip. _parse_history_root was creating naive datetimes via datetime.utcfromtimestamp(), which datetime.timestamp() (in the serialiser) then treated as local time. Switched to UTC-aware datetimes (datetime.fromtimestamp(..., tz=timezone.utc)) throughout.
    • AO UUID was not preserved on retrieve: _parse_ao discarded the UUID attribute from the <ao> element, so every re-submit generated a fresh random UUID. Now stamped onto obj.id after parsing.
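The proctime drift fixed above is the classic naive-datetime pitfall. A minimal reproduction of the broken and corrected patterns:

```python
from datetime import datetime, timezone

epoch = 1_705_276_800.0     # a fixed UNIX timestamp (UTC seconds)

# Broken pattern: utcfromtimestamp() yields a *naive* datetime, which
# .timestamp() later reinterprets in local time, so each round-trip
# drifts by the local UTC offset (zero only in a UTC environment).
naive = datetime.utcfromtimestamp(epoch)
drift = naive.timestamp() - epoch   # equals minus the local UTC offset

# Fixed pattern: a UTC-aware datetime round-trips exactly, everywhere.
aware = datetime.fromtimestamp(epoch, tz=timezone.utc)
print(aware.timestamp() == epoch)   # True
```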

0.2.2

  • First pypi.org release

0.2.1

  • Renamed package from pyda to ltpda to resolve PyPI naming conflict (issue #3). File extension .pyda → .ltpda (.pyda files still load for backward compatibility). Repository sentinel binary_pyda → binary_hdf5.
  • Dependency updates for NumPy 2.x compatibility: numpy uncapped (≥ 1.18), matplotlib ≥ 3.9, h5py ≥ 3.10. Added mpmath ≥ 1.0 as a runtime dependency.
  • Wired up ltpda.dsp.NoiseGen (Franklin noise generator): added missing mpmath dependency, exported from ltpda.dsp, added smoke tests.
  • Bug fixes:
    • PZ() no-argument constructor crashed with TypeError because numpy.isreal(None) is True, causing fq2ri(f0=None) to be called. Guarded dispatch block with if f is not None.
    • TSData.nsecs() and TSData.fs() raised ValueError / emitted numpy warnings on empty time-series objects. Both now return 0.0 early when xdata() is empty.
    • Axis.ddata setter size check was gated on numpy.shape(ddata)[0] > 2 (first dimension, not total size), allowing mismatched error vectors to be silently accepted. Replaced with ddata.size > 1.
  • Test suite: removed three stale @unittest.skip decorators (bugs resolved). Excluded ltpda/repo/* from coverage measurement (requires live MySQL). Coverage threshold met at 56%.

0.2.0

  • Repository connectivity: MySQL backend, submit/retrieve AO objects, search interface.
  • History tracking: record and replay analysis steps; XML exchange with MATLAB LTPDA.

Upstream baseline (pyda, pre-fork)

The following was already present in pyda-group/pyda before this fork was created, written by Martin Hewitson, Artem Basalaev, Christian Darsow-Fromm, and Oliver Gerberding:

  • YData, XYData, TSData, FSData — core data classes with error propagation
  • Unit — physical unit algebra (parse, simplify, convert to SI)
  • PZModel / PZ — pole/zero model representation and response computation
  • SpecWin — spectral window functions (Hann, flat-top, Kaiser-Bessel, …)
  • dsp.spectral — PSD / ASD estimation via lpsd
  • dsp.filter — digital filter representation
  • dsp.noisegen — Franklin colored-noise generator (wired up in 0.2.1)
  • HDF5 save/load for all data classes
  • Operator overloading (+, -, *, /, **, comparison) with unit checking

Heritage

ltpda was created by Martin Hewitson, Artem Basalaev, Christian Darsow-Fromm, and Oliver Gerberding as a Python reimplementation of the LTPDA MATLAB toolbox for gravitational-wave and precision-measurement data analysis. The upstream project is maintained at gitlab.com/pyda-group/pyda.

This fork extends the upstream work for integration with the LTPDA repository stack.


Disclaimer

This software is provided "as is", without warranty of any kind, express or implied. Use at your own risk. The authors make no guarantees about correctness, fitness for a particular purpose, or continued development. See LICENSE.md for full terms.


License

Upstream pyda copyright 2022 Martin Hewitson, Artem Basalaev, Christian Darsow-Fromm, and Oliver Gerberding. See Heritage.

Modifications and extensions in this fork: Copyright 2026 Simon Barke.

Licensed under the Apache License, Version 2.0. See LICENSE.md.
