
CausalCompass: Evaluating the Robustness of Time-Series Causal Discovery in Misspecified Scenarios


CausalCompass is a Python package that provides a flexible and extensible benchmark suite for evaluating the robustness of time-series causal discovery (TSCD) methods under misspecified modeling assumptions. For more details, please refer to the documentation.

[Figure: AUPRC performance radar plot]


Abstract

Causal discovery from time series is a fundamental task in machine learning. However, its widespread adoption is hindered by a reliance on untestable causal assumptions and by the lack of robustness-oriented evaluation in existing benchmarks. To address these challenges, we propose CausalCompass, a flexible and extensible benchmark suite designed to assess the robustness of time-series causal discovery (TSCD) methods under violations of modeling assumptions. To demonstrate the practical utility of CausalCompass, we conduct extensive benchmarking of representative TSCD algorithms across eight assumption-violation scenarios. Our experimental results indicate that no single method consistently attains optimal performance across all settings. Nevertheless, the methods exhibiting superior overall performance across diverse scenarios are almost invariably deep learning-based approaches. We further provide hyperparameter sensitivity analyses to deepen the understanding of these findings. We also find, somewhat surprisingly, that NTS-NOTEARS relies heavily on standardized preprocessing in practice, performing poorly in the vanilla setting but exhibiting strong performance after standardization. Finally, our work aims to provide a comprehensive and systematic evaluation of TSCD methods under assumption violations, thereby facilitating their broader adoption in real-world applications.

Key Features

  • 8 assumption-violation scenarios: Confounders, nonstationarity, measurement error, standardization, missing data, mixed data, min-max normalization, and trend/seasonality
  • 2 vanilla models: VAR (linear) and Lorenz-96 (nonlinear)
  • 11 TSCD algorithms spanning 6 major methodological categories:
    • Granger causality-based: VAR, LGC
    • Constraint-based: PCMCI
    • Noise-based: VARLiNGAM
    • Score-based: DYNOTEARS, NTS-NOTEARS
    • Topology-based: TSCI
    • Deep learning-based: cMLP, cLSTM, CUTS, CUTS+
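Two of the scenarios above (standardization and min-max normalization) are simple per-variable preprocessing transforms. As a minimal illustrative sketch, independent of the package's own implementation, they can be written as follows (population standard deviation is an assumption here):

```python
import math

def standardize(series):
    """Z-score a single time series: zero mean, unit (population) std."""
    n = len(series)
    mean = sum(series) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in series) / n)
    return [(x - mean) / std for x in series]

def min_max_normalize(series):
    """Rescale a single time series to [0, 1]."""
    lo, hi = min(series), max(series)
    return [(x - lo) / (hi - lo) for x in series]
```

Applying such transforms before running a TSCD method is exactly the kind of preprocessing whose effect the benchmark measures (e.g., the abstract's observation that NTS-NOTEARS performs poorly without standardization but strongly with it).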

Datasets

The datasets/ directory contains sample datasets; the complete datasets can be generated with the provided scripts. For convenience and reproducibility, an archive of the complete datasets is publicly available on Google Drive.

The generated datasets follow the naming convention:

[scenario]_[params]_[model]_p[p]_T[T]_[optional]_seed[seed].npz

Example: confounder_rho0.5_VAR_p10_T1000_seed0.npz
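As an illustrative sketch, the naming convention can be parsed with a regular expression. The pattern below is an assumption, not part of the package's API: it presumes single-token scenario names and treats the model token as anything without underscores.

```python
import re

# Fields follow the convention:
# [scenario]_[params]_[model]_p[p]_T[T]_[optional]_seed[seed].npz
PATTERN = re.compile(
    r"^(?P<scenario>[a-z]+)"
    r"_(?P<params>.+?)"
    r"_(?P<model>[^_]+)"
    r"_p(?P<p>\d+)"
    r"_T(?P<T>\d+)"
    r"(?:_(?P<optional>.+?))?"
    r"_seed(?P<seed>\d+)\.npz$"
)

def parse_dataset_name(name):
    """Split a dataset filename into its naming-convention fields."""
    m = PATTERN.match(name)
    return m.groupdict() if m else None
```

For the example filename above, `parse_dataset_name` yields `scenario="confounder"`, `params="rho0.5"`, `model="VAR"`, `p="10"`, `T="1000"`, `seed="0"`, with the optional field absent.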

Installation

Install with pip

# 1. Create a clean conda environment
conda create -n causalcompass-env python=3.10 -y
conda activate causalcompass-env

# 2. Install causalcompass from PyPI
pip install causalcompass

# 3. Verify installation
pip show causalcompass
python -c "import causalcompass; print(dir(causalcompass))"

Usage Example

from causalcompass.datasets.measurement_error import simulate_var_with_measure_error
from causalcompass.algorithms import PCMCI

# Step 1: Generate VAR data with measurement error
p, T, lag, seed = 10, 500, 3, 0
gamma = 1.2  # measurement error scale factor

data, beta, true_adj = simulate_var_with_measure_error(
    p=p, T=T, lag=lag, gamma=gamma, seed=seed
)
print(f"Data shape: {data.shape}")              # (500, 10)
print(f"Ground truth shape: {true_adj.shape}")  # (10, 10)

# Step 2: Initialize and run the algorithm
model = PCMCI(tau_max=3, pc_alpha=0.05, alpha=0.05)
predicted_adj = model.run(data)

# Step 3: Evaluate
all_metrics, no_diag_metrics = model.eval(
    true_adj,
    predicted_adj,
    shd_thresholds=[0, 0.01, 0.05, 0.1, 0.3],
)
print(f"AUROC: {all_metrics['auroc']:.3f}")
print(f"AUPRC: {all_metrics['auprc']:.3f}")
print(f"NSHD: {all_metrics['shd']:.3f}")  # shd stores normalized SHD (NSHD)
print(f"AUROC (no diag): {no_diag_metrics['auroc']:.3f}")
print(f"AUPRC (no diag): {no_diag_metrics['auprc']:.3f}")
print(f"NSHD (no diag): {no_diag_metrics['shd']:.3f}")  # shd stores normalized SHD (NSHD)
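The comments above note that the `shd` key stores the normalized SHD (NSHD). Under the common definition (structural Hamming distance between the binarized predicted matrix and the true adjacency matrix, divided by the number of possible edges) — an assumption here, not the package's documented formula — NSHD can be sketched as:

```python
def normalized_shd(true_adj, pred_adj, threshold=0.5):
    """Normalized structural Hamming distance between two p x p
    adjacency matrices (nested lists). Predicted edge scores are
    binarized at `threshold`; mismatches are divided by p * p."""
    p = len(true_adj)
    mismatches = sum(
        int(bool(true_adj[i][j]) != (pred_adj[i][j] > threshold))
        for i in range(p)
        for j in range(p)
    )
    return mismatches / (p * p)
```

For example, with a true graph `[[0, 1], [0, 0]]` and predicted scores `[[0.1, 0.9], [0.8, 0.0]]`, only the (1, 0) entry is mismatched, giving an NSHD of 1/4 = 0.25.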

Citation

If you use this code or datasets in your research, please cite:

@misc{yi2026causalcompass,
  title   = {{CausalCompass}: Evaluating the Robustness of Time-Series Causal Discovery in Misspecified Scenarios},
  author  = {Yi, Huiyang and Shen, Xiaojian and Wu, Yonggang and Chen, Duxin and Wang, He and Yu, Wenwu},
  year    = {2026},
  note    = {Under review as a conference paper}
}

Note: The final bibliographic information (e.g., venue and proceedings details) will be updated upon paper acceptance.

License

  • The code in this repository is released under the MIT License.
  • The datasets generated and provided by this repository are released under the CC BY 4.0 License.

Contributing

Contributions are welcome! If you encounter bugs, have suggestions for improvements, or would like to extend CausalCompass with additional assumption-violation scenarios or evaluation protocols, please feel free to open an issue or submit a pull request.

Contact

For questions or issues, please open an issue on the project repository.
