MoCoO
Momentum Contrast ODE-Regularized VAE for Single-Cell RNA Velocity
A unified deep learning framework combining Variational Autoencoders (VAE), Neural Ordinary Differential Equations (ODE), and Momentum Contrast (MoCo) for robust single-cell trajectory inference and representation learning.
Features
- VAE-based dimensionality reduction with multiple reconstruction likelihoods (MSE, NB, ZINB, Poisson, ZIP)
- Neural ODE for continuous trajectory modeling and pseudotime inference
- Momentum Contrast (MoCo) for robust contrastive representation learning
- Information bottleneck for hierarchical feature extraction
- Disentanglement losses (DIP-VAE, β-TC-VAE, InfoVAE) for interpretable latents
- Vector field analysis for RNA velocity visualization
Installation
From PyPI (recommended)
```shell
pip install mocoo
```
From source
```shell
git clone https://github.com/PeterPonyu/MoCoO.git
cd MoCoO
pip install -e .
```
Development installation
```shell
git clone https://github.com/PeterPonyu/MoCoO.git
cd MoCoO
pip install -e ".[dev]"
```
Publishing
The package is automatically published to PyPI when a GitHub release is created.
To create a new release:

1. Bump the version:

```shell
python release.py patch   # bug fixes        (0.0.1 → 0.0.2)
python release.py minor   # new features     (0.0.1 → 0.1.0)
python release.py major   # breaking changes (0.0.1 → 1.0.0)
```

2. Commit and push:

```shell
git add -A
git commit -m "Bump version to X.Y.Z"
git push
```

3. Create a GitHub release:
- Go to Releases
- Click "Create a new release"
- Tag: vX.Y.Z (e.g., v0.1.0)
- Title: Release X.Y.Z
- Description: list the changes
- Click "Publish release"

4. Automated publishing:
- GitHub Actions automatically builds and publishes to PyPI
- Check the Actions tab for build status
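The bump step above follows standard semantic versioning. A minimal sketch of what the bump logic might look like (hypothetical; the actual `release.py` is not shown in this document and may differ):

```python
def bump_version(version: str, part: str) -> str:
    """Bump a 'major.minor.patch' version string by the given part."""
    major, minor, patch = (int(x) for x in version.split("."))
    if part == "major":
        return f"{major + 1}.0.0"
    if part == "minor":
        return f"{major}.{minor + 1}.0"
    if part == "patch":
        return f"{major}.{minor}.{patch + 1}"
    raise ValueError(f"unknown part: {part!r}")

print(bump_version("0.0.1", "patch"))  # 0.0.2
print(bump_version("0.0.1", "minor"))  # 0.1.0
print(bump_version("0.0.1", "major"))  # 1.0.0
```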
Quick Start
Basic VAE
import scanpy as sc
from mocoo import MoCoO
adata = sc.read_h5ad('data.h5ad')
model = MoCoO(
adata,
layer='counts',
loss_mode='nb',
batch_size=128
)
model.fit(epochs=100)
latent = model.get_latent()
adata.obsm['X_mocoo'] = latent
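Once the embedding is stored in `adata.obsm`, it can feed any standard downstream tool. A self-contained sketch using scikit-learn's KMeans, with a random matrix standing in for the real `model.get_latent()` output (synthetic data, for illustration only):

```python
import numpy as np
from sklearn.cluster import KMeans

# Stand-in for model.get_latent(): an (n_cells, latent_dim) embedding
rng = np.random.default_rng(0)
latent = rng.normal(size=(300, 10))

# Cluster cells in latent space
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(latent)
print(labels.shape)  # one cluster label per cell
```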
With ODE + MoCo
```python
model = MoCoO(
    adata,
    use_ode=True,
    use_moco=True,
    latent_dim=10,
    i_dim=2,
    moco_K=4096,
    aug_prob=0.5,
    batch_size=256,
)
model.fit(epochs=400, patience=25)

latent = model.get_latent()
velocity = model.get_velocity()
pseudotime = model.get_time()
transition = model.get_transition(top_k=30)
```
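Assuming `get_transition()` returns a row-stochastic cell-cell transition matrix (an assumption based on the name; the source does not specify the format), downstream use might look like the following sketch, with random scores standing in for the model's output:

```python
import numpy as np

rng = np.random.default_rng(0)
n_cells = 5

# Stand-in for model.get_transition(): non-negative cell-to-cell scores
scores = rng.random((n_cells, n_cells))
T = scores / scores.sum(axis=1, keepdims=True)  # normalize rows to probabilities

assert np.allclose(T.sum(axis=1), 1.0)  # each row is a probability distribution
likely_next = T.argmax(axis=1)          # most probable successor of each cell
```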
Parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| `adata` | AnnData | required | Annotated data matrix |
| `layer` | str | `'counts'` | Layer containing raw counts |
| `loss_mode` | str | `'nb'` | Likelihood: `'mse'`, `'nb'`, `'zinb'`, `'poisson'`, `'zip'` |
| `latent_dim` | int | `10` | Latent space dimension |
| `i_dim` | int | `2` | Bottleneck dimension (< `latent_dim`) |
| `use_ode` | bool | `False` | Enable Neural ODE |
| `use_moco` | bool | `False` | Enable MoCo |
| `moco_K` | int | `4096` | MoCo queue size |
| `batch_size` | int | `128` | Mini-batch size |
| `lr` | float | `1e-4` | Learning rate |

See the docstrings for the complete parameter list.
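The `moco_K` parameter sets the size of the MoCo dictionary queue of negative keys. A minimal numpy sketch of the FIFO queue update from the original MoCo paper (the package's internal implementation is not shown here and may differ):

```python
import numpy as np

class MoCoQueue:
    """FIFO queue of encoded keys, illustrating the moco_K dictionary."""

    def __init__(self, K: int, dim: int):
        self.queue = np.zeros((K, dim), dtype=np.float32)
        self.ptr = 0
        self.K = K

    def enqueue(self, keys: np.ndarray) -> None:
        """Replace the oldest keys with a new batch (K divisible by batch)."""
        n = keys.shape[0]
        assert self.K % n == 0  # simplification used in the MoCo paper
        self.queue[self.ptr:self.ptr + n] = keys
        self.ptr = (self.ptr + n) % self.K  # wrap around the circular buffer

q = MoCoQueue(K=8, dim=4)
q.enqueue(np.ones((4, 4), dtype=np.float32))  # first batch fills slots 0..3
```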
API
Training
```python
model.fit(epochs=400, patience=25, val_every=5)
```
Inference
```python
latent = model.get_latent()          # Latent embeddings
bottleneck = model.get_bottleneck()  # Bottleneck features
time = model.get_time()              # Pseudotime (ODE only)
velocity = model.get_velocity()      # RNA velocity (ODE only)
transition = model.get_transition()  # Transition matrix (ODE only)
```
Metrics
```python
loss_hist = model.get_loss_history()
metrics_hist = model.get_metrics_history()
resources = model.get_resource_metrics()
```
Architecture
```
Input (n_genes)
    ↓
Encoder (log1p → MLP → latent_dim)
    ↓
[Optional ODE] Neural ODE dynamics
    ↓
Bottleneck (latent_dim → i_dim → latent_dim)
    ↓
Decoder (MLP → n_genes)
    ↓
Reconstruction (NB/ZINB/MSE/Poisson/ZIP)

[Optional MoCo] Contrastive learning on augmented views
```
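The dimension flow through the diagram can be checked with a toy forward pass, using random linear maps as stand-ins for the actual MLPs (shapes only; no trained weights involved):

```python
import numpy as np

n_cells, n_genes, latent_dim, i_dim = 64, 2000, 10, 2
rng = np.random.default_rng(0)
counts = rng.poisson(2.0, size=(n_cells, n_genes)).astype(np.float32)

x = np.log1p(counts)                                   # encoder input
z = x @ rng.normal(size=(n_genes, latent_dim)) * 0.01  # latent (MLP stand-in)
b = z @ rng.normal(size=(latent_dim, i_dim))           # bottleneck compress
z_hat = b @ rng.normal(size=(i_dim, latent_dim))       # bottleneck expand
recon = z_hat @ rng.normal(size=(latent_dim, n_genes)) # decoder output

print(z.shape, b.shape, recon.shape)
```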
Loss Functions
- Reconstruction: MSE, NB, ZINB, Poisson, ZIP
- KL Divergence: β-weighted regularization
- Disentanglement: DIP-VAE, β-TC-VAE, InfoVAE (MMD)
- ODE Regularization: MSE between VAE and ODE latents
- MoCo Contrastive: InfoNCE loss
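As an illustration of the count-based reconstruction terms, here is a numpy sketch of the negative binomial log-likelihood with mean `mu` and inverse dispersion `theta` (a common scRNA-seq parameterization; the package's exact form is not shown in this document):

```python
import numpy as np
from scipy.special import gammaln

def nb_log_likelihood(x, mu, theta):
    """Log-likelihood of counts x under NB(mean=mu, inverse dispersion=theta)."""
    return (gammaln(x + theta) - gammaln(theta) - gammaln(x + 1)
            + theta * np.log(theta / (theta + mu))
            + x * np.log(mu / (theta + mu)))

# Sanity check: the pmf should sum to ~1 over the count support
grid = np.arange(200)
total = np.exp(nb_log_likelihood(grid, mu=2.0, theta=5.0)).sum()
print(total)  # ~1.0
```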
Validation Metrics
- ARI: Adjusted Rand Index
- NMI: Normalized Mutual Information
- ASW: Silhouette Score
- CH: Calinski-Harabasz Index
- DB: Davies-Bouldin Index
- Corr: Latent correlation
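These clustering metrics are all available in scikit-learn (presumably what the package computes internally, though that is an assumption). A self-contained example on two synthetic, well-separated clusters:

```python
import numpy as np
from sklearn.metrics import (adjusted_rand_score, normalized_mutual_info_score,
                             silhouette_score)

rng = np.random.default_rng(0)
# Two tight clusters standing in for a latent embedding
latent = np.vstack([rng.normal(0, 0.1, (50, 2)),
                    rng.normal(5, 0.1, (50, 2))])
labels_true = np.repeat([0, 1], 50)
labels_pred = labels_true.copy()  # a perfect clustering, for illustration

ari = adjusted_rand_score(labels_true, labels_pred)       # 1.0 when identical
nmi = normalized_mutual_info_score(labels_true, labels_pred)
asw = silhouette_score(latent, labels_pred)               # near 1 when separated
print(ari, nmi, asw)
```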
Citation
```bibtex
@article{mocoo2025,
  title={MoCoO: Momentum Contrast ODE-Regularized VAE for Single-Cell Trajectory Inference},
  author={Ponyu, Peter},
  year={2025}
}
```
License
MIT License
Contact
GitHub: @PeterPonyu
Repository: MoCoO