Marine Extremes Detection and Tracking


Marine Extremes Python Package

Efficient & scalable Marine Extremes detection, identification, & tracking for Exascale Climate Data.

MarEx is a high-performance Python framework for identifying and tracking extreme oceanographic events (such as Marine Heatwaves or Acidity Extremes) in massive climate datasets. Built on advanced statistical methods and distributed computing, it processes decades of daily-resolution global ocean data with unprecedented efficiency and scalability.

Key Capabilities

  • ⚡ Extreme Performance: Process 100+ years of high-resolution daily global data in minutes
  • 🔬 Advanced Analytics: Multiple statistical methodologies for robust extreme event detection
  • 📈 Complex Event Tracking: Seamlessly handles coherent object splitting, merging, and evolution
  • 🌐 Universal Grid Support: Native support for both regular (lat/lon) grids and unstructured ocean models
  • ☁️ Cloud-Native Scaling: Identical codebase scales from a laptop to a supercomputer, running on 1024+ cores
  • 🧠 Memory Efficient: Intelligent chunking and lazy evaluation for datasets larger than memory

View 20 Years of marEx Tracking (video): https://github.com/user-attachments/assets/36ee3150-c869-4cba-be68-628dc37e4775


Features

Data Pre-processing Pipeline

MarEx implements a highly-optimised preprocessing pipeline powered by dask for efficient parallel computation and scaling to very large spatio-temporal datasets. Included are two complementary methods for calculating anomalies and detecting extremes:

Anomaly Calculation:

  1. Shifting Baseline — Scientifically rigorous definition of anomalies relative to a backwards-looking, smoothed rolling climatology.
  2. Detrended Baseline — Efficiently removes the trend & seasonal cycle using a 6+ coefficient model (mean, annual & semi-annual harmonics, and arbitrary polynomial trends). (Highly efficient, but this approximation may bias certain statistics.)
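As an illustration of the detrended-baseline idea (a simplified NumPy sketch, not the marEx implementation), the 6-coefficient model can be fit by ordinary least squares and the anomaly taken as the residual:

```python
import numpy as np

def detrended_baseline(t_days, series):
    """Least-squares fit of a 6-coefficient model (mean, linear trend,
    annual & semi-annual harmonics); returns the anomaly (residual).
    Illustrative sketch only."""
    omega = 2 * np.pi / 365.25
    X = np.column_stack([
        np.ones_like(t_days),                                    # mean
        t_days,                                                  # linear trend
        np.sin(omega * t_days), np.cos(omega * t_days),          # annual
        np.sin(2 * omega * t_days), np.cos(2 * omega * t_days),  # semi-annual
    ])
    coeffs, *_ = np.linalg.lstsq(X, series, rcond=None)
    return series - X @ coeffs

# Synthetic daily series: mean + trend + seasonal cycle + noise
t = np.arange(3650.0)
rng = np.random.default_rng(0)
sst = 15 + 0.001 * t + 2 * np.sin(2 * np.pi * t / 365.25) + rng.normal(0, 0.1, t.size)
anom = detrended_baseline(t, sst)
```

Higher-order polynomial trends would simply add further columns to the design matrix.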

Extreme Detection:

  1. Hobday Extreme — Implements a methodology similar to Hobday et al. (2016), with local day-of-year-specific thresholds determined from the quantile within a rolling window.
  2. Global Extreme — Applies a global-in-time percentile threshold at each point across the entire dataset, optionally renormalising anomalies by a 30-day rolling standard deviation. (Highly efficient, but may misrepresent seasonal variability and differs from common definitions in the literature.)
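The day-of-year threshold idea behind the Hobday-style detection can be sketched as follows (a simplified serial NumPy version; the quantile for each calendar day is taken over a rolling window of neighbouring days pooled across all years):

```python
import numpy as np

def doy_thresholds(data, doy, q=95, window=11):
    """Day-of-year-specific percentile thresholds from a rolling window
    of calendar days, in the spirit of Hobday et al. (2016).
    Illustrative sketch only."""
    half = window // 2
    thresh = np.empty(366)
    for d in range(366):
        # Wrap the day-of-year window around the year boundary
        days = np.arange(d - half, d + half + 1) % 366
        pool = data[np.isin(doy, days)]
        thresh[d] = np.percentile(pool, q)
    return thresh

# Synthetic example: ~10 years of daily anomalies
rng = np.random.default_rng(1)
doy = np.tile(np.arange(366), 10)[:3650]
data = rng.normal(0, 1, doy.size)
th = doy_thresholds(data, doy)
extreme = data > th[doy]   # boolean extreme-event field
```

With a 95th-percentile threshold, roughly 5% of days at each point are flagged as extreme.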

Object Detection & Tracking

Object Detection:

  • Implements efficient algorithms for object detection in 2D geographical data.
  • Fully-parallelised workflow built on dask for extremely fast & larger-than-memory computation.
  • Uses morphological opening & closing to fill small holes and gaps in binary features.
  • Filters out small objects based on area thresholds.
  • Identifies and labels connected regions in binary data representing arbitrary events (e.g. SST or SSS extrema, tracer presence, eddies, etc.).
  • Performance/Scaling Test: 100 years of daily 0.25° resolution binary data with 64 cores...
    • Takes ~5 wall-minutes per century
    • Requires only ~1 GB memory per core (with dask chunks of 25 days)
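The detection steps above (morphological fill, connected-region labelling, area filtering) can be sketched with `scipy.ndimage`; this is an illustrative serial version, not the dask-parallelised marEx code:

```python
import numpy as np
from scipy import ndimage

def detect_objects(binary, r_fill=1, min_area=5):
    """Close then open to fill small holes/gaps, label connected
    regions, and drop objects below an area threshold.
    Illustrative sketch of the detection steps."""
    structure = ndimage.generate_binary_structure(2, 1)
    cleaned = ndimage.binary_closing(binary, structure, iterations=r_fill)
    cleaned = ndimage.binary_opening(cleaned, structure, iterations=r_fill)
    labels, n = ndimage.label(cleaned)
    areas = ndimage.sum_labels(np.ones_like(labels), labels,
                               index=np.arange(1, n + 1))
    for obj_id, area in enumerate(areas, start=1):
        if area < min_area:
            labels[labels == obj_id] = 0  # discard small objects
    return labels

binary = np.zeros((20, 20), bool)
binary[2:8, 2:8] = True   # large object (kept)
binary[4, 4] = False      # small interior hole (filled by closing)
binary[15, 15] = True     # isolated pixel (removed by opening)
labels = detect_objects(binary)
```

The isolated pixel is erased and the interior hole filled, leaving a single labelled object.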

Object Tracking:

  • Implements strict event tracking conditions to avoid spuriously linking objects into a few very large events.
  • Permits temporal gaps (of T_fill days) between objects, to allow more continuous event tracking.
  • Requires objects to overlap by at least an overlap_threshold fraction of the smaller object's area to be considered the same event and continue tracking with the same ID.
  • Accounts for & keeps a history of object splitting & merging events, ensuring objects are more coherent and retain their previous identities & histories.
  • Improves upon the splitting & merging logic of Sun et al. (2023):
    • In this New Version: Partition the child object based on the parent of the nearest-neighbour cell (not the nearest parent centroid).
  • Provides much more accessible and usable tracking outputs:
    • Tracked object properties (such as area, centroid, and any other user-defined properties) are mapped into ID-time space
    • Details & Properties of all Merging/Splitting events are recorded.
    • Provides other useful information that may be difficult to extract from the large object ID field, such as:
      • Event presence in time
      • Event start/end times and duration
      • etc...
  • Performance/Scaling Test: 100 years of daily 0.25° resolution binary data with 64 cores...
    • Takes ~8 wall-minutes per decade (cf. the old method, i.e. without merge-split tracking, time-gap filling, or overlap thresholding, which, updated here to leverage dask, takes ~1 wall-minute per decade)
    • Requires only ~2 GB memory per core (with dask chunks of 25 days)
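The overlap criterion can be sketched in a few lines; `same_event` below is a hypothetical helper illustrating the rule (overlap measured against the smaller object's area), not a marEx function:

```python
import numpy as np

def same_event(mask_t0, mask_t1, overlap_threshold=0.5):
    """Decide whether two binary objects at consecutive times continue
    the same event: their overlap must reach `overlap_threshold` of
    the smaller object's area. Illustrative sketch of the criterion."""
    overlap = np.logical_and(mask_t0, mask_t1).sum()
    smaller = min(mask_t0.sum(), mask_t1.sum())
    return smaller > 0 and overlap / smaller >= overlap_threshold

a = np.zeros((10, 10), bool); a[2:6, 2:6] = True  # area 16
b = np.zeros((10, 10), bool); b[4:8, 4:8] = True  # overlap with a: 4 cells
c = np.zeros((10, 10), bool); c[3:7, 3:7] = True  # overlap with a: 9 cells
```

Here `same_event(a, b)` fails (4/16 = 0.25 < 0.5), while `same_event(a, c)` succeeds (9/16 ≈ 0.56), so only `c` would inherit the ID of `a`.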

Visualisation

Plotting:

  • Provides a few helper functions to create pretty plots, wrapped subplots, and animations (e.g. below).

cf. Old (Basic) ID Method vs. New Tracking & Merging Algorithm:

https://github.com/user-attachments/assets/5acf48eb-56bf-43e5-bfc4-4ef1a7a90eff

Technical Architecture

Distributed Computing Stack:

  • Framework: Dask for distributed computation with asynchronous task scheduling
  • Parallelism: Multi-level spatio-temporal parallelisation
  • Memory Management: Lazy evaluation with automatic spilling and graph optimisation
  • I/O Optimisation: Zarr-based intermediate storage with compression
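The lazy-evaluation model underlying this stack can be seen in a minimal plain-dask example (independent of marEx): operations build a task graph, and data is only materialised chunk-by-chunk on `.compute()`:

```python
import dask.array as da

# Lazy, chunked array: nothing is computed until .compute() is called,
# and chunks are processed in parallel without loading the whole array.
x = da.random.random((4000, 4000), chunks=(1000, 1000))
anomaly = x - x.mean(axis=0)       # builds a task graph; no data touched yet
result = anomaly.mean().compute()  # executes the graph chunk-by-chunk
```

The same pattern lets marEx process datasets far larger than memory, since only a few chunks are resident at any time.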

Performance Optimisations:

  • JIT Compilation: Numba-accelerated critical paths for numerical kernels
  • GPU Acceleration: Optional JAX backend for tensor operations
  • Sparse Operations: Custom sparse matrix algorithms for unstructured grids
  • Cache-Aware: Memory access patterns optimised for modern CPU architectures

Computational Workflow

  1. Preprocess: Remove trends & seasonal cycles and identify anomalous extremes
  2. Detect: Filter & label connected regions using morphological operations
  3. Track: Follow objects through time, handling complex evolution patterns
  4. Analyse: Extract event statistics, duration, and spatial properties

Quick Start Example

import xarray as xr
import marEx

# Load sea surface temperature data
sst = xr.open_dataset('sst_data.nc', chunks={}).sst

# Pre-process SST Data to identify extremes: cf. `01_preprocess_extremes.ipynb`
extreme_events_ds = marEx.preprocess_data(
    sst,
    threshold_percentile=95,
    method_anomaly='shifting_baseline',
    method_extreme='hobday_extreme'
)

# Identify & Track Marine Heatwaves through time: cf. `02_id_track_events.ipynb`
events_ds = marEx.tracker(
    extreme_events_ds.extreme_events,
    extreme_events_ds.mask,
    R_fill=8,
    area_filter_quartile=0.5,
    allow_merging=True
).run()

# Visualise results: cf. `03_visualise_events.ipynb`
fig, ax, im = (events_ds.ID_field > 0).mean("time").plotX.single_plot(
    marEx.PlotConfig(var_units="MHW Frequency", cmap="hot_r", cperc=[0, 96])
)

Installation & Setup

Full Installation

# Complete HPC installation with all optional dependencies
pip install "marEx[full,hpc]"

Development Installation

# Clone and install for development
git clone https://github.com/wienkers/marEx.git
cd marEx
pip install -e ".[dev]"

# Install pre-commit hooks
pre-commit install

Getting Help

If you encounter installation issues:

  1. Documentation: Check the full documentation for detailed guides and API reference
  2. Check Dependencies: Run marEx.print_dependency_status() to identify missing components
  3. Search Issues: Check the GitHub Issues for similar problems
  4. System Information: Include your OS, Python version, and error messages when reporting issues
  5. Support: Reach out to Aaron Wienkers

Funding

This project has received funding through:

  • The EERIE (European Eddy-Rich ESMs) Project
  • The European Union's Horizon Europe research and innovation programme under Grant Agreement No. 101081383
  • The Swiss State Secretariat for Education, Research and Innovation (SERI) under contract #22.00366

Please contact Aaron Wienkers with any questions, comments, issues, or bugs.
