LAIOR: Lorentz Attentive Interpretable ODE Regularized VAE for single-cell omics analysis
A deep learning framework for single-cell omics analysis combining geometric manifold learning, latent dynamics, and comprehensive interpretability.
LAIOR (Lorentz Attentive Interpretable ODE Regularized VAE) is a PyTorch-based unified framework for single-cell RNA-seq and ATAC-seq analysis (AnnData format) that learns low-dimensional embeddings from raw count matrices using count-likelihood objectives (NB/ZINB/Poisson/ZIP). It integrates (1) Lorentz geometric regularization for hierarchical structure, (2) dual-path information bottleneck architecture for coordinated biological programs, (3) Neural ODE regularization for temporal trajectory stability, and (4) transformer-based attention mechanisms for capturing long-range dependencies.
Key Features
- Advanced VAE Architecture: Dimensionality reduction with count-based likelihoods (NB, ZINB, Poisson, ZIP)
- Geometric Manifold Learning: Lorentz (hyperbolic) or Euclidean regularization for hierarchical structure
- Neural ODE Trajectories: Latent dynamics solved with `torchdiffeq` (CPU by design). Supports ODE function types `legacy`, `time_mlp` (time-conditioned), and `gru`.
- Comprehensive Interpretability: Attribution analysis for Genes → Latents (discriminative) and Latents → Genes (reconstructive) pathways.
- Flexible Encoders: Standard MLP or Transformer-based with self-attention for capturing complex feature dependencies.
- Information Bottleneck: Hierarchical representation with controllable compression via the `i_dim` parameter.
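The Lorentz regularization listed above comes from hyperbolic geometry, which embeds hierarchies with low distortion. As a standalone sketch (not LAIOR's implementation), the Lorentz (hyperboloid) model places points on the manifold ⟨x,x⟩_L = −1 and measures geodesic distance through the Lorentzian inner product:

```python
import numpy as np

def lorentz_inner(x, y):
    """Lorentzian inner product: -x0*y0 + sum_i xi*yi."""
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def lift_to_hyperboloid(v):
    """Map a Euclidean vector v onto the hyperboloid <x,x>_L = -1
    by solving for the time-like coordinate x0."""
    x0 = np.sqrt(1.0 + np.dot(v, v))
    return np.concatenate(([x0], v))

def lorentz_distance(x, y):
    """Geodesic distance on the hyperboloid: arccosh(-<x,y>_L)."""
    inner = np.clip(-lorentz_inner(x, y), 1.0, None)  # numerical safety
    return np.arccosh(inner)

x = lift_to_hyperboloid(np.array([0.3, -0.2]))
y = lift_to_hyperboloid(np.array([1.0, 0.5]))
print(lorentz_distance(x, x))  # 0.0 (a point is at distance 0 from itself)
print(lorentz_distance(x, y))  # positive geodesic distance
```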
Data Requirements
- `adata.layers[layer]` must contain raw, non-negative integer-like counts (UMI counts). LAIOR checks this heuristically and raises a `ValueError` if the layer looks normalized/log-transformed.
- LAIOR applies its own `log1p` + clipping / adaptive normalization internally for training.
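As a hedged illustration of the kind of heuristic check described above (the function name and tolerance here are illustrative, not LAIOR's actual code): raw UMI counts should be non-negative and integer-valued, whereas normalized or log-transformed data typically is not.

```python
import numpy as np

def looks_like_raw_counts(X, tol=1e-6):
    """Heuristic: raw UMI counts are non-negative integers."""
    X = np.asarray(X)
    if (X < 0).any():
        return False
    return np.allclose(X, np.round(X), atol=tol)

counts = np.array([[0., 3., 12.], [1., 0., 7.]])
normalized = np.log1p(counts / counts.sum(axis=1, keepdims=True) * 1e4)

print(looks_like_raw_counts(counts))      # True
print(looks_like_raw_counts(normalized))  # False
```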
Installation
```bash
pip install liora
```
Or install from source:
```bash
git clone https://github.com/PeterPonyu/Liora.git
cd Liora
pip install -e .
```
Quick Start
Basic Usage
```python
import scanpy as sc
from liora import LAIOR

# Load your data
adata = sc.read_h5ad('data.h5ad')

# Train with default settings
model = LAIOR(
    adata,
    layer='counts',
    hidden_dim=128,
    latent_dim=10,
    i_dim=2,
)
model.fit(epochs=100)

# Extract embeddings
latent = model.get_latent()
```
Advanced Configuration
```python
# Transformer encoder + Neural ODE trajectory inference
model = LAIOR(
    adata,
    layer='counts',
    hidden_dim=128,
    latent_dim=10,
    i_dim=2,
    # Encoder configuration
    encoder_type='transformer',
    attn_embed_dim=64,
    attn_num_heads=4,
    attn_num_layers=2,
    attn_seq_len=32,
    # ODE configuration
    use_ode=True,
    ode_type='time_mlp',
    ode_time_cond='concat',
    ode_hidden_dim=64,
    ode_solver_method='dopri5',
    ode_rtol=1e-5,
    ode_atol=1e-7,
    # Loss weights
    lorentz=5.0,
    beta=1.0,
)
model.fit(epochs=200, patience=20)

# Extract results
latent = model.get_latent()           # Latent embeddings
bottleneck = model.get_bottleneck()   # Information bottleneck
pseudotime = model.get_time()         # Predicted pseudotime
transitions = model.get_transition()  # Transition matrix
```
Note: `get_time()` and `get_transition()` require `use_ode=True`.
Configuration Guide
Architecture Parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| `hidden_dim` | int | 128 | Hidden layer dimension |
| `latent_dim` | int | 10 | Primary latent space size |
| `i_dim` | int | 2 | Information bottleneck dimension |
Encoder Options
| Parameter | Type | Default | Description |
|---|---|---|---|
| `encoder_type` | str | `'mlp'` | `'mlp'` or `'transformer'` |
| `attn_embed_dim` | int | 64 | Transformer embedding dimension |
| `attn_num_heads` | int | 4 | Number of attention heads |
| `attn_num_layers` | int | 2 | Transformer encoder layers |
| `attn_seq_len` | int | 32 | Token sequence length |
ODE Configuration
| Parameter | Type | Default | Description |
|---|---|---|---|
| `use_ode` | bool | False | Enable Neural ODE |
| `ode_type` | str | `'time_mlp'` | `'legacy'`, `'time_mlp'`, or `'gru'` |
| `ode_time_cond` | str | `'concat'` | `'concat'`, `'film'`, or `'add'` |
| `ode_hidden_dim` | int | None | ODE network hidden size |
| `ode_solver_method` | str | `'rk4'` | Solver: `'rk4'`, `'dopri5'`, `'adams'`, etc. |
| `ode_step_size` | float | None | Fixed step size, or `'auto'` |
| `ode_rtol` | float | None | Relative tolerance (adaptive solvers) |
| `ode_atol` | float | None | Absolute tolerance (adaptive solvers) |
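For intuition on what a time-conditioned ODE function with `'concat'` conditioning can look like, here is a rough numpy-only sketch (hypothetical weights and shapes, not LAIOR's implementation): the scalar time t is appended to the latent state before a small MLP computes dz/dt.

```python
import numpy as np

rng = np.random.default_rng(0)
latent_dim, hidden_dim = 10, 64

# Hypothetical weights of a two-layer time-conditioned MLP
W1 = rng.normal(size=(latent_dim + 1, hidden_dim)) * 0.1  # +1 for concatenated t
b1 = np.zeros(hidden_dim)
W2 = rng.normal(size=(hidden_dim, latent_dim)) * 0.1
b2 = np.zeros(latent_dim)

def ode_func(t, z):
    """dz/dt = MLP([z, t]) -- 'concat' time conditioning."""
    zt = np.concatenate([z, [t]])
    h = np.tanh(zt @ W1 + b1)
    return h @ W2 + b2

z0 = rng.normal(size=latent_dim)
dz = ode_func(0.5, z0)
print(dz.shape)  # (10,)
```

Because t enters the input directly, the same latent state yields different derivatives at different times, which is what distinguishes `time_mlp` from a time-independent vector field.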
Loss Configuration
| Parameter | Type | Default | Description |
|---|---|---|---|
| `recon` | float | 1.0 | Reconstruction loss weight |
| `irecon` | float | 0.0 | Bottleneck reconstruction weight |
| `lorentz` | float | 0.0 | Manifold regularization weight |
| `beta` | float | 1.0 | KL divergence weight (β-VAE) |
| `dip` | float | 0.0 | DIP-VAE loss weight |
| `tc` | float | 0.0 | Total Correlation loss weight |
| `info` | float | 0.0 | MMD loss weight |
| `loss_type` | str | `'nb'` | `'nb'`, `'zinb'`, `'poisson'`, or `'zip'` |
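For reference, the negative-binomial likelihood that `loss_type='nb'` refers to can be written down directly. This is a standalone sketch in the mean/dispersion parameterization common in scRNA-seq models, not LAIOR's internal code:

```python
from math import lgamma, log

def nb_nll(x, mu, theta):
    """Negative log-likelihood of count x under NB(mean=mu, dispersion=theta).

    log P(x) = lgamma(x+theta) - lgamma(theta) - lgamma(x+1)
             + theta*log(theta/(theta+mu)) + x*log(mu/(theta+mu))
    """
    ll = (lgamma(x + theta) - lgamma(theta) - lgamma(x + 1)
          + theta * log(theta / (theta + mu))
          + x * log(mu / (theta + mu)))
    return -ll

# A predicted mean close to the observed count scores better (lower NLL)
print(nb_nll(5, mu=5.0, theta=10.0) < nb_nll(5, mu=50.0, theta=10.0))  # True
```

The zero-inflated variants (`'zinb'`, `'zip'`) add a mixture weight for excess zeros on top of this base likelihood.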
Training Parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| `lr` | float | 1e-4 | Learning rate |
| `batch_size` | int | 128 | Mini-batch size |
| `train_size` | float | 0.7 | Training set proportion |
| `val_size` | float | 0.15 | Validation set proportion |
| `test_size` | float | 0.15 | Test set proportion |
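The default proportions above imply a 70/15/15 random split of cells. A minimal sketch of how such a split can be produced (illustrative only, not LAIOR's internal logic):

```python
import numpy as np

def split_indices(n_cells, train_size=0.7, val_size=0.15, seed=0):
    """Random train/val/test split; the test set gets the remainder."""
    rng = np.random.default_rng(seed)
    perm = rng.permutation(n_cells)
    n_train = int(n_cells * train_size)
    n_val = int(n_cells * val_size)
    return perm[:n_train], perm[n_train:n_train + n_val], perm[n_train + n_val:]

train, val, test = split_indices(1000)
print(len(train), len(val), len(test))  # 700 150 150
```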
Methods
Training
```python
model.fit(epochs=400, patience=25, val_every=5, early_stop=True)
```
Extraction
```python
latent = model.get_latent()           # Latent representations
bottleneck = model.get_bottleneck()   # Information bottleneck embeddings
pseudotime = model.get_time()         # Pseudotime (ODE mode)
transitions = model.get_transition()  # Transition probabilities (ODE mode)
```
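In ODE mode, the pseudotime vector and transition matrix feed standard downstream analyses. A hedged sketch with synthetic stand-ins (real values would come from `get_time()` and `get_transition()`): order cells along the trajectory and sanity-check that the transition matrix is row-stochastic before using it.

```python
import numpy as np

rng = np.random.default_rng(0)
n_cells = 5

# Synthetic stand-ins for model.get_time() / model.get_transition()
pseudotime = rng.random(n_cells)
raw = rng.random((n_cells, n_cells))
transitions = raw / raw.sum(axis=1, keepdims=True)  # row-normalize

# Order cells along the trajectory
order = np.argsort(pseudotime)
print(order)

# Sanity check: each row of a transition matrix sums to 1
assert np.allclose(transitions.sum(axis=1), 1.0)

# Expected pseudotime after one transition step, per cell
expected_next = transitions @ pseudotime
print(expected_next.shape)  # (5,)
```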
Citation
If you use Liora in your research, please cite:
```bibtex
@software{laior2025,
  title  = {LAIOR: Lorentz Attentive Interpretable ODE Regularized VAE},
  author = {Zeyu Fu and Jiawei Fu and Chunlin Chen and Keyang Zhang and Junping Wang and Song Wang},
  year   = {2025},
  url    = {https://github.com/PeterPonyu/Liora}
}
```
License
MIT License - see LICENSE file for details.
Contributing
Contributions are welcome! See CONTRIBUTING.md for guidelines.
Support
- Issues: GitHub Issues
- Email: fuzeyu99@126.com
- Documentation: GitHub Repository