
CausalCompass: Evaluating the Robustness of Time-Series Causal Discovery in Misspecified Scenarios


CausalCompass is a Python package that provides a flexible and extensible benchmark suite for evaluating the robustness of time-series causal discovery (TSCD) methods under misspecified modeling assumptions. For more details, please refer to the documentation.

[Figure: AUPRC performance radar plot]


Abstract

Causal discovery from time series is a fundamental task in machine learning. However, its widespread adoption is hindered by a reliance on untestable causal assumptions and by the lack of robustness-oriented evaluation in existing benchmarks. To address these challenges, we propose CausalCompass, a flexible and extensible benchmark suite designed to assess the robustness of time-series causal discovery (TSCD) methods under violations of modeling assumptions. To demonstrate the practical utility of CausalCompass, we conduct extensive benchmarking of representative TSCD algorithms across eight assumption-violation scenarios. Our experimental results indicate that no single method consistently attains optimal performance across all settings. Nevertheless, the methods exhibiting superior overall performance across diverse scenarios are almost invariably deep learning-based approaches. We further provide hyperparameter sensitivity analyses to deepen the understanding of these findings. We also find, somewhat surprisingly, that NTS-NOTEARS relies heavily on standardized preprocessing in practice, performing poorly in the vanilla setting but exhibiting strong performance after standardization. Finally, our work aims to provide a comprehensive and systematic evaluation of TSCD methods under assumption violations, thereby facilitating their broader adoption in real-world applications.

Key Features

  • 8 assumption-violation scenarios: Confounders, nonstationarity, measurement error, standardization, missing data, mixed data, min-max normalization, and trend/seasonality
  • 2 vanilla models: VAR (linear) and Lorenz-96 (nonlinear)
  • 11 TSCD algorithms spanning 6 major methodological categories:
    • Granger causality-based: VAR, LGC
    • Constraint-based: PCMCI
    • Noise-based: VARLiNGAM
    • Score-based: DYNOTEARS, NTS-NOTEARS
    • Topology-based: TSCI
    • Deep learning-based: cMLP, cLSTM, CUTS, CUTS+

Datasets

The datasets/ directory contains sample datasets. Complete datasets can be generated using the provided scripts. For convenience and reproducibility, an archive of the complete datasets is publicly available on Google Drive.

The generated datasets follow the naming convention:

[scenario]_[params]_[model]_p[p]_T[T]_[optional]_seed[seed].npz

Example: confounder_rho0.5_VAR_p10_T1000_seed0.npz
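If you need to recover the components of a dataset filename programmatically, the convention above can be parsed with a small regular expression. The sketch below is illustrative and not part of the package; the regex assumes a lowercase scenario name and the two vanilla models listed above:

```python
import re

# Illustrative parser for the dataset naming convention:
# [scenario]_[params]_[model]_p[p]_T[T]_[optional]_seed[seed].npz
PATTERN = re.compile(
    r"^(?P<scenario>[a-z]+)_(?P<params>.+)_(?P<model>VAR|Lorenz96)"
    r"_p(?P<p>\d+)_T(?P<T>\d+)(?:_(?P<optional>.+?))?_seed(?P<seed>\d+)\.npz$"
)

def parse_dataset_name(filename: str) -> dict:
    """Split a dataset filename into its named components."""
    match = PATTERN.match(filename)
    if match is None:
        raise ValueError(f"Unrecognized dataset filename: {filename}")
    return match.groupdict()

info = parse_dataset_name("confounder_rho0.5_VAR_p10_T1000_seed0.npz")
print(info["scenario"], info["model"], info["p"], info["T"], info["seed"])
# confounder VAR 10 1000 0
```

The `optional` group comes back as `None` when the filename has no optional segment, which matches the example above.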

Installation

Install with pip

# 1. Create a clean conda environment
conda create -n causalcompass-env python=3.10 -y
conda activate causalcompass-env

# 2. Install causalcompass from PyPI
pip install causalcompass

# 3. Verify installation
pip show causalcompass
python -c "import causalcompass; print(dir(causalcompass))"

Usage Example

from causalcompass.datasets.measurement_error import simulate_var_with_measure_error
from causalcompass.algorithms import PCMCI

# Step 1: Generate VAR data with measurement error
p, T, lag, seed = 10, 500, 3, 0
gamma = 1.2  # measurement error scale factor

data, beta, true_adj = simulate_var_with_measure_error(
    p=p, T=T, lag=lag, gamma=gamma, seed=seed
)
print(f"Data shape: {data.shape}")              # (500, 10)
print(f"Ground truth shape: {true_adj.shape}")  # (10, 10)

# Step 2: Initialize and run the algorithm
model = PCMCI(tau_max=3, pc_alpha=0.05, alpha=0.05)
predicted_adj = model.run(data)

# Step 3: Evaluate
all_metrics, no_diag_metrics = model.eval(
    true_adj,
    predicted_adj,
    shd_thresholds=[0, 0.01, 0.05, 0.1, 0.3],
)
print(f"AUROC: {all_metrics['auroc']:.3f}")
print(f"AUPRC: {all_metrics['auprc']:.3f}")
print(f"NSHD: {all_metrics['shd']:.3f}")  # shd stores normalized SHD (NSHD)
print(f"AUROC (no diag): {no_diag_metrics['auroc']:.3f}")
print(f"AUPRC (no diag): {no_diag_metrics['auprc']:.3f}")
print(f"NSHD (no diag): {no_diag_metrics['shd']:.3f}")  # shd stores normalized SHD (NSHD)
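For intuition on the NSHD values reported above, here is a minimal standalone sketch of one plausible normalization: the structural Hamming distance between the thresholded prediction and the ground truth, divided by the number of possible edges. This normalization is an assumption for illustration, not the package's internal implementation:

```python
import numpy as np

def normalized_shd(true_adj: np.ndarray, pred_adj: np.ndarray,
                   threshold: float = 0.0) -> float:
    """Structural Hamming distance divided by the number of possible edges.

    Edge weights with absolute value above `threshold` are treated as present.
    """
    true_bin = (np.abs(true_adj) > threshold).astype(int)
    pred_bin = (np.abs(pred_adj) > threshold).astype(int)
    mismatches = np.sum(true_bin != pred_bin)  # wrongly added or missed edges
    return mismatches / true_bin.size

true_adj = np.array([[0, 1], [0, 0]])
pred_adj = np.array([[0, 1], [1, 0]])  # one spurious edge
print(normalized_shd(true_adj, pred_adj))  # 0.25 (1 wrong entry out of 4)
```

Sweeping `threshold` over the `shd_thresholds` list in the usage example would then yield one NSHD value per threshold.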

Citation

If you use this code or datasets in your research, please cite:

@misc{yi2026causalcompass,
  title   = {{CausalCompass}: Evaluating the Robustness of Time-Series Causal Discovery in Misspecified Scenarios},
  author  = {Yi, Huiyang and Shen, Xiaojian and Wu, Yonggang and Chen, Duxin and Wang, He and Yu, Wenwu},
  year    = {2026},
  note    = {Under review as a conference paper}
}

Note: The final bibliographic information (e.g., venue and proceedings details) will be updated upon paper acceptance.

License

  • The code in this repository is released under the MIT License.
  • The datasets generated and provided by this repository are released under the CC BY 4.0 License.

Contributing

Contributions are welcome! If you encounter bugs, have suggestions for improvements, or would like to extend CausalCompass with additional assumption-violation scenarios or evaluation protocols, please feel free to open an issue or submit a pull request.

Contact

For questions or issues, please open an issue on the repository.
