A framework for AI-driven digital twins for cognitive decline.
# CognitiveTwin: AI-Driven Digital Twins for Personalized Cognitive Decline Prediction
CognitiveTwin is a state-of-the-art AI framework for creating personalized digital twins that predict cognitive decline trajectories in patients with Alzheimer's disease and related dementias. Our approach integrates multi-modal longitudinal data using transformer-based fusion and temporal dynamics modeling to achieve superior prediction accuracy with calibrated uncertainty quantification.
## Key Results
- Superior Accuracy: MAE of 1.619 MMSE points (47.5% improvement over SOTA)
- Strong Temporal Dynamics: R² = 0.682 with excellent trend capture
- Excellent Discrimination: AUROC = 0.912 for AD progression prediction
- Well-Calibrated: ECE = 0.054 with reliable uncertainty estimates
- Fair Across Demographics: Balanced performance (male: 1.622, female: 1.614 MAE)
- Robust to Missing Data: Only 0.4% degradation under 15% missingness
- Clinical Utility: 46% reduction in number needed to screen
## Architecture Overview

```mermaid
graph TB
    A[Multi-Modal Data] --> B[Transformer Fusion]
    B --> C[Temporal Modeling]
    C --> D[Uncertainty Quantification]
    D --> E[Clinical Predictions]
    A1[Cognitive Scores] --> B
    A2[MRI Volumetrics] --> B
    A3[PET/CSF Biomarkers] --> B
    A4[Demographics/Genetics] --> B
    C --> C1[GRU-based Dynamics]
    C --> C2[Attention Mechanisms]
    E --> E1[MMSE Trajectory]
    E --> E2[95% Prediction Intervals]
    E --> E3[Progression Risk]
```
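The fusion and temporal stages in the diagram can be sketched in a few lines of PyTorch. This is an illustrative sketch only: the class name, layer sizes, and modality dimensions below are assumptions, not the package's actual API.

```python
import torch
import torch.nn as nn

class FusionSketch(nn.Module):
    """Illustrative cross-modal fusion: per-modality projections feed a
    TransformerEncoder, then a GRU models dynamics across visits."""
    def __init__(self, modality_dims, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.proj = nn.ModuleList([nn.Linear(d, d_model) for d in modality_dims])
        enc_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, n_layers)
        self.gru = nn.GRU(d_model, d_model, batch_first=True)
        self.head = nn.Linear(d_model, 1)  # predicted MMSE per visit

    def forward(self, modalities):
        # modalities: list of (batch, visits, dim_i) tensors, one per modality
        tokens = torch.stack([p(m) for p, m in zip(self.proj, modalities)], dim=2)
        b, t, k, d = tokens.shape
        # attend across the k modality tokens at each visit, then pool
        fused = self.encoder(tokens.view(b * t, k, d)).mean(dim=1).view(b, t, d)
        out, _ = self.gru(fused)  # temporal dynamics over the visit sequence
        return self.head(out)     # (batch, visits, 1) trajectory

# Hypothetical per-modality feature counts: cognitive, MRI, biomarker, demographic
model = FusionSketch([3, 4, 3, 2])
x = [torch.randn(2, 5, d) for d in [3, 4, 3, 2]]
print(model(x).shape)  # torch.Size([2, 5, 1])
```

The mean-pooling over modality tokens is one simple fusion choice; the actual model may use a learned pooling or CLS-style token.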
## Repository Structure

```text
cognitivedt/
├── cognitivedt/                 # Core package
│   ├── data/                    # Data schemas and models
│   ├── io/                      # Data loading and preprocessing
│   ├── representation/          # Multi-modal fusion models
│   ├── dynamics/                # Temporal modeling components
│   ├── evaluation/              # Metrics and validation
│   └── utils/                   # Utilities and helpers
├── experiments/                 # Experimental setup
│   ├── scripts/                 # Training and evaluation scripts
│   │   ├── run_experiments_fixed.py  # Main corrected experiments
│   │   ├── generate_figures.py       # Publication figure generation
│   │   └── generate_tables.py        # LaTeX table generation
│   ├── configs/                 # Experiment configurations
│   └── notebooks/               # Jupyter analysis notebooks
├── docs/                        # Documentation
│   ├── paper/                   # Research paper and manuscript
│   ├── experiments/             # Experimental results and analysis
│   │   ├── figures/             # Generated publication figures
│   │   ├── tables/              # LaTeX tables for publication
│   │   └── results.md           # Detailed quantitative results
│   ├── technical_guide.md       # Technical implementation guide
│   └── data_guide.md            # Data acquisition and setup guide
├── tests/                       # Unit and integration tests
├── pyproject.toml               # Project configuration
└── README.md                    # This file
```
## Quick Start

### Installation

**Option 1: Install from PyPI (Recommended)**

```bash
# Install the latest stable version
pip install cognitivedt

# Or install with all optional dependencies
pip install "cognitivedt[all]"

# For development features
pip install "cognitivedt[dev]"
```

**Option 2: Install from Source**

```bash
# Clone the repository
git clone https://github.com/bulentsoykan/cognitivedt.git
cd cognitivedt

# Create virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install in development mode
pip install -e ".[all]"
```
### Data Setup

1. **Download the TADPOLE dataset**: Visit the TADPOLE Challenge site and download `TADPOLE_D1_D2.csv`
2. **Place it in the repository**: Copy the file to the repository root directory
3. **Verify the installation**: Run the data loader test

```bash
python -c "from cognitivedt.io.loader import TADPOLELoader; print('Installation successful')"
```
### Reproduce Main Results

```bash
# Navigate to experiments directory
cd experiments/scripts

# Run corrected experiments (reproduces paper results)
python run_experiments_fixed.py

# Generate publication figures
python generate_figures.py

# Generate LaTeX tables
python generate_tables.py
```

Expected output:

```text
CORRECTED EXPERIMENTS COMPLETED!
Best model (Transformer) MAE: 1.619
Results saved to: ../../docs/experiments/
```
## Key Features

### Multi-Modal Data Integration
- Cognitive Assessments: MMSE, ADAS-Cog, CDR scores
- Neuroimaging: MRI volumetrics (hippocampus, ventricles, cortical thickness)
- Biomarkers: PET amyloid/tau, CSF Aβ42/tau ratios
- Demographics & Genetics: Age, sex, education, APOE genotype
### Advanced AI Architecture
- Transformer-Based Fusion: Cross-modal attention for optimal feature integration
- Temporal Dynamics: GRU-based modeling for disease progression patterns
- Uncertainty Quantification: Bayesian neural networks with calibrated confidence intervals
- Robust Training: Handles missing data and temporal distribution shifts
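One lightweight way to obtain the kind of calibrated predictive intervals described above is Monte Carlo dropout; this is a generic sketch under that assumption, and may differ from the uncertainty scheme the package actually implements.

```python
import torch
import torch.nn as nn

# Monte Carlo dropout: keep dropout active at inference and sample repeatedly.
# The network and feature dimension (12) here are illustrative placeholders.
net = nn.Sequential(nn.Linear(12, 64), nn.ReLU(), nn.Dropout(0.2), nn.Linear(64, 1))

def predict_with_uncertainty(net, x, n_samples=100):
    net.train()  # leave dropout on so each forward pass is stochastic
    with torch.no_grad():
        samples = torch.stack([net(x) for _ in range(n_samples)])  # (n_samples, batch, 1)
    return samples.mean(dim=0), samples.std(dim=0)

x = torch.randn(4, 12)
mean, std = predict_with_uncertainty(net, x)
print(mean.shape, std.shape)  # torch.Size([4, 1]) torch.Size([4, 1])
```

Quantiles of the same sample stack give prediction intervals instead of a standard deviation.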
### Clinical Validation
- TADPOLE Dataset: 1,666 patients, 12,505 longitudinal visits
- Rigorous Evaluation: Train/validation/test splits with temporal validation
- Fairness Assessment: Performance evaluated across demographic groups
- Clinical Metrics: Decision curve analysis and number needed to screen
## Experimental Results

### Model Performance Comparison

| Method | MAE ↓ | RMSE ↓ | R² ↑ | AUROC ↑ |
|---|---|---|---|---|
| LSTM | 3.420 | 4.680 | 0.220 | 0.730 |
| CNN-LSTM | 3.180 | 4.510 | 0.280 | 0.760 |
| Transformer | 2.940 | 4.230 | 0.350 | 0.780 |
| Graph Neural Net | 2.670 | 3.980 | 0.410 | 0.810 |
| CognitiveTwin (Ours) | 1.619 | 2.248 | 0.682 | 0.912 |
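The metrics in this table (MAE, RMSE, R²) and the ECE reported earlier follow standard definitions; the sketch below hand-rolls them on synthetic data and is not the package's evaluation module.

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    # Textbook MAE, RMSE, and coefficient of determination
    err = y_pred - y_true
    mae = np.abs(err).mean()
    rmse = np.sqrt((err ** 2).mean())
    r2 = 1 - (err ** 2).sum() / ((y_true - y_true.mean()) ** 2).sum()
    return mae, rmse, r2

def expected_calibration_error(probs, labels, n_bins=10):
    # Average gap between predicted confidence and observed event frequency
    bins = np.minimum((probs * n_bins).astype(int), n_bins - 1)
    ece = 0.0
    for b in range(n_bins):
        in_bin = bins == b
        if in_bin.any():
            ece += in_bin.mean() * abs(probs[in_bin].mean() - labels[in_bin].mean())
    return ece

rng = np.random.default_rng(0)
y_true = rng.uniform(10, 30, 500)            # synthetic MMSE-like scores
mae, rmse, r2 = regression_metrics(y_true, y_true + rng.normal(0, 2, 500))

probs = rng.uniform(0, 1, 500)               # well-calibrated by construction
labels = (rng.random(500) < probs).astype(float)
print(f"MAE={mae:.2f} RMSE={rmse:.2f} R2={r2:.2f} "
      f"ECE={expected_calibration_error(probs, labels):.3f}")
```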
### Fairness Analysis
| Demographic Group | MAE | AUROC | Performance Gap |
|---|---|---|---|
| Male | 1.622 | 0.920 | Reference |
| Female | 1.614 | 0.893 | -0.5% |
| Age < 65 | 1.608 | 0.925 | +1.4% |
| Age 65-75 | 1.619 | 0.912 | Reference |
| Age > 75 | 1.635 | 0.901 | -1.0% |
### Robustness Evaluation
| Condition | MAE | Degradation |
|---|---|---|
| Complete Data | 1.619 | 0% |
| 15% Missing (MNAR) | 1.625 | +0.4% |
| 25% Missing | 1.741 | +7.5% |
| Temporal Shift | 1.687 | +4.2% |
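MNAR (missing not at random) means the chance a value is missing depends on the value itself, e.g. lower cognitive scores being less likely to be recorded. A sketch of how such masking can be simulated for a robustness test; the logistic link and rates here are illustrative assumptions, not the experiment's actual protocol.

```python
import numpy as np

def mnar_mask(x, target_rate=0.15, rng=None):
    """Mask entries with probability that increases as values decrease."""
    rng = rng or np.random.default_rng(0)
    z = (x - x.mean()) / x.std()
    p = 1 / (1 + np.exp(z))          # lower values -> higher missingness
    p *= target_rate / p.mean()      # rescale to the desired overall rate
    return rng.random(x.shape) < np.clip(p, 0, 1)

x = np.random.default_rng(1).normal(24, 4, size=1000)  # synthetic MMSE-like scores
mask = mnar_mask(x)
print(round(mask.mean(), 3))             # close to the 15% target overall
print(bool(x[mask].mean() < x[~mask].mean()))  # missing entries skew low: True
```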
## Usage Examples

### Basic Model Training
```python
import torch
from cognitivedt import TADPOLELoader
from cognitivedt.representation import TransformerFusionModel
from cognitivedt.dynamics import FixedCognitiveTwinModel

# Load data
loader = TADPOLELoader()
train_data, val_data, test_data = loader.load_and_split("TADPOLE_D1_D2.csv")

# Create model
model = FixedCognitiveTwinModel(
    input_dim=12,    # Multi-modal feature dimension
    hidden_dim=256,  # Hidden layer size
    n_layers=4,      # Number of transformer layers
    n_heads=8        # Number of attention heads
)

# Train model
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
# ... training loop implementation
```
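The elided training loop follows the usual PyTorch pattern. A self-contained sketch on synthetic tensors; the stand-in model, shapes, and L1 objective are assumptions (L1 matches the paper's headline MAE metric), not the package's actual training code.

```python
import torch
import torch.nn as nn

# Stand-in model and data; the real classes live in the cognitivedt package
model = nn.Sequential(nn.Linear(12, 256), nn.ReLU(), nn.Linear(256, 1))
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss_fn = nn.L1Loss()              # mean absolute error objective

x = torch.randn(64, 12)            # (patients, multi-modal features)
y = torch.randn(64, 1)             # target cognitive score

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)    # forward pass + MAE
    loss.backward()                # backpropagate
    optimizer.step()               # update parameters

print(f"final L1 loss: {loss.item():.3f}")
```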
### Prediction with Uncertainty

```python
# Make predictions with uncertainty quantification
predictions, uncertainties = model.predict_with_uncertainty(test_data)

# Extract prediction intervals across the sampled predictions
mean_pred = predictions.mean(dim=0)
lower_bound = predictions.quantile(0.025, dim=0)
upper_bound = predictions.quantile(0.975, dim=0)

# Report the first patient's prediction (format requires scalar values)
print(f"Predicted MMSE: {mean_pred[0].item():.1f}")
print(f"95% CI: [{lower_bound[0].item():.1f}, {upper_bound[0].item():.1f}]")
```
## Documentation

### API Documentation

Generate local documentation:

```bash
pip install sphinx sphinx-rtd-theme
cd docs
make html
open _build/html/index.html
```
## Testing

```bash
# Run all tests
pytest

# Run with coverage
pytest --cov=cognitivedt --cov-report=html

# Run specific test modules
pytest tests/test_models.py
pytest tests/test_data.py
```
## Contributing

We welcome contributions! Please see our Contributing Guidelines for details.

### Development Setup

```bash
# Install development dependencies
pip install -e ".[dev]"

# Set up pre-commit hooks
pre-commit install

# Run code quality checks
ruff check cognitivedt/
black cognitivedt/
mypy cognitivedt/
```
## Citation

If you use CognitiveTwin in your research, please cite our paper:

```bibtex
@article{cognitivetwin2025,
  title={CognitiveTwin: AI-Driven Digital Twins for Personalized Cognitive Decline Prediction},
  author={Soykan, Bulent},
  year={2026},
  url={https://github.com/bulentsoykan/cognitivedt}
}
```
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Acknowledgments
- TADPOLE Challenge for providing the standardized evaluation framework
- ADNI Consortium for the longitudinal neuroimaging data
- NIH/NIA for research funding support
- PyTorch Team for the deep learning framework
## Contact
For questions about the research or implementation:
- Technical Issues: Open a GitHub issue
- Research Collaboration: Contact soykanb@gmail.com
- Author Website: bulentsoykan.com
- Professional Profile: LinkedIn
- Technical Blog: bulentsoykan.github.io
⭐ Star this repository if you find CognitiveTwin useful for your research!