
mojito-processor


Postprocessing tools for LISA Mojito L01 data for use with L2D noise analysis.

Goal of package

The goal of this package is to provide a simple, modular, and well-documented set of tools for processing LISA Mojito L1 data. The package applies a signal processing pipeline (filtering, downsampling, trimming, windowing) to data loaded via the mojito package. The design emphasizes ease of use and flexibility, allowing users to customize the processing steps as needed for their specific analysis tasks.
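The kind of processing chain described above (filtering followed by anti-aliased downsampling) can be sketched with scipy. This is an illustration of the general technique only, not the package's actual implementation; the sampling rate, cutoff, and decimation factor here are made up:

```python
import numpy as np
from scipy import signal

# Hypothetical raw time series: 4 Hz sampling, a slow sinusoid plus noise
fs = 4.0
t = np.arange(0, 1000, 1 / fs)
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 0.1 * t) + 0.01 * rng.normal(size=t.size)

# High-pass Butterworth filter (second-order sections for numerical stability)
sos = signal.butter(2, 5e-3, btype="highpass", fs=fs, output="sos")
x_filt = signal.sosfiltfilt(sos, x)  # zero-phase filtering

# Downsample by 4x with a Kaiser-windowed anti-aliasing FIR filter
x_down = signal.resample_poly(x_filt, up=1, down=4, window=("kaiser", 31.0))
fs_down = fs / 4  # 1 Hz after decimation
```

Applying the anti-aliasing filter as part of the decimation step (rather than decimating raw samples) is what prevents power above the new Nyquist frequency from folding back into the band of interest.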

Dependencies

This package depends on mojito, the official LISA L1 file reader. All dependencies are installed automatically via pip or uv.

Installation

pip install mojito-processor
# or
uv pip install mojito-processor

Development Setup

# Clone the repository
git clone https://github.com/OllieBurke/MojitoProcessor.git
cd MojitoProcessor

# Install uv if you haven't already
curl -LsSf https://astral.sh/uv/install.sh | sh

# Install the package and all dependency groups
uv sync --all-groups

# Install pre-commit hooks
uv run pre-commit install

# Run pre-commit on all files (optional)
uv run pre-commit run --all-files

Quick Start

from MojitoProcessor import load_file, load_processed, process_pipeline, write

# ── Load Mojito L1 data ───────────────────────────────────────────────────────
data = load_file("mojito_data.h5")

# ── Pipeline parameters ───────────────────────────────────────────────────────

downsample_kwargs = {
    "target_fs": 0.2,      # Hz — target sampling rate (None = no downsampling)
    "kaiser_window": 31.0, # Kaiser window beta (higher = more aggressive anti-aliasing)
}

filter_kwargs = {
    "highpass_cutoff": 5e-6,                                # Hz — high-pass cutoff (always applied)
    "lowpass_cutoff": 0.8 * downsample_kwargs["target_fs"], # Hz — low-pass cutoff (None for high-pass only)
    "order": 2,                                             # Butterworth filter order
}

trim_kwargs = {
    "fraction": 0.02,  # Total fraction of data trimmed symmetrically from both ends
}

truncate_kwargs = {
    "days": 7.0,  # Segment length in days (splits dataset into non-overlapping chunks)
}

window_kwargs = {
    "window": "tukey",  # Window type: 'tukey', 'hann', 'hamming', 'blackman', 'blackmanharris', 'planck'
    "alpha": 0.0125,    # Taper fraction for Tukey/Planck windows
}

# ─────────────────────────────────────────────────────────────────────────────

processed_segments = process_pipeline(
    data,
    downsample_kwargs=downsample_kwargs,
    filter_kwargs=filter_kwargs,
    trim_kwargs=trim_kwargs,
    truncate_kwargs=truncate_kwargs,
    window_kwargs=window_kwargs,
)

# Access processed data
sp = processed_segments["segment0"]
print(f"Sampling rate: {sp.fs} Hz")
print(f"Duration:      {sp.T / 86400:.2f} days")
print(f"TCB start:     {sp.t0:.6g} s")

# ── Write to HDF5 ─────────────────────────────────────────────────────────────
write(
    "processed.h5",
    processed_segments,
    raw_data=data,
    filter_kwargs=filter_kwargs,
    downsample_kwargs=downsample_kwargs,
    trim_kwargs=trim_kwargs,
    truncate_kwargs=truncate_kwargs,
    window_kwargs=window_kwargs,
)
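As an aside on the window_kwargs above: a Tukey taper with alpha=0.0125 leaves almost 99% of each segment untouched. A quick sketch with scipy's window (the segment length assumes the 7-day / 0.2 Hz settings used above):

```python
import numpy as np
from scipy.signal.windows import tukey

fs = 0.2                      # Hz, matching target_fs above
n = int(7.0 * 86400 * fs)     # samples in one 7-day segment (120960)
w = tukey(n, alpha=0.0125)    # taper fraction matching window_kwargs["alpha"]

# Only alpha/2 of the samples at each end are tapered; the rest are exactly 1
flat = np.count_nonzero(w == 1.0) / n
```

A small taper fraction like this suppresses spectral leakage at the segment edges while sacrificing very little of the data's statistical power.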

Alternatively, load, process, and write in a single call using the high-level pipeline:

from MojitoProcessor.pipelines import read_and_process

segments = read_and_process(
    "mojito_data.h5",
    filter_kwargs=filter_kwargs,
    downsample_kwargs=downsample_kwargs,
    trim_kwargs=trim_kwargs,
    truncate_kwargs=truncate_kwargs,
    window_kwargs=window_kwargs,
    output_path="processed.h5",
)

Features

  • Load — load_file reads LISA Mojito L1 HDF5 files via the mojito package
  • Process — process_pipeline applies filtering, downsampling, trimming, segmentation, and windowing in a single call
  • Write — write saves processed segments and raw auxiliary data (LTTs, orbits, noise estimates) to HDF5
  • Reload — load_processed reads a file written by write back into a dict[str, SignalProcessor], enabling deferred analysis without reprocessing
  • Pipeline — read_and_process combines load, process, and write into one function, with an optional CLI interface
  • TCB time tracking — t0 is propagated through every processing step, including segmentation
  • TDI channel support — XYZ and AET (via SignalProcessor.to_aet())
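The XYZ → AET conversion mentioned above is conventionally an orthogonal combination of the three channels. The standalone helper below uses one common sign convention, which may differ from what SignalProcessor.to_aet() actually implements:

```python
import numpy as np

# Hypothetical helper illustrating one common AET convention
def xyz_to_aet(X, Y, Z):
    """Orthogonal AET combination of the XYZ TDI channels."""
    A = (Z - X) / np.sqrt(2.0)
    E = (X - 2.0 * Y + Z) / np.sqrt(6.0)
    T = (X + Y + Z) / np.sqrt(3.0)
    return A, E, T
```

Noise that is identical in all three channels cancels exactly in A and E, which is one reason the AET basis is convenient for noise analysis.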

Building the Documentation

First, install Pandoc (required by nbsphinx to render the example notebooks):

# macOS
brew install pandoc

# Linux (Debian/Ubuntu)
sudo apt-get install pandoc

Then install the docs dependencies and build:

uv sync --group docs
uv run sphinx-build -b html docs docs/_build/html

Open docs/_build/html/index.html in your browser to view the result.

License

MIT License — see LICENSE file for details.

Project details


Download files

Download the file for your platform.

Source Distribution

mojito_processor-0.5.1.tar.gz (184.7 kB)

Uploaded Source

Built Distribution


mojito_processor-0.5.1-py3-none-any.whl (21.7 kB)

Uploaded Python 3

File details

Details for the file mojito_processor-0.5.1.tar.gz.

File metadata

  • Download URL: mojito_processor-0.5.1.tar.gz
  • Upload date:
  • Size: 184.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for mojito_processor-0.5.1.tar.gz
Algorithm Hash digest
SHA256 3bbd525552f876306cef3abd6629c2a2b47ec57c56186a141f057f771a63631f
MD5 af399f06ae736dfd3f27c4fc92a87264
BLAKE2b-256 fb29223b9dc5495495fe4cb16ab9bca169f4193fc4dcac3f0c5f30d5fc9336cd
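To check a downloaded file against the SHA256 digest listed above, a small hashlib helper suffices (the local path is hypothetical):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare sha256_of("mojito_processor-0.5.1.tar.gz") with the digest above;
# a mismatch means the download is corrupted or was tampered with.
```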


Provenance

The following attestation bundles were made for mojito_processor-0.5.1.tar.gz:

Publisher: publish.yml on OllieBurke/MojitoProcessor

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file mojito_processor-0.5.1-py3-none-any.whl.

File metadata

File hashes

Hashes for mojito_processor-0.5.1-py3-none-any.whl
Algorithm Hash digest
SHA256 af223a63556f558c012023c2f788fb05c78ce2cd697f2d7d86f8eb50ee58fd5f
MD5 9e96e78eb767d8a9cb81f183652ed044
BLAKE2b-256 c29bfe9583a4cd69439f2ce39253885203612e6744fc86cb896b7179912261c7


Provenance

The following attestation bundles were made for mojito_processor-0.5.1-py3-none-any.whl:

Publisher: publish.yml on OllieBurke/MojitoProcessor

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
