Portal

Adversarial domain translation networks for integrating large-scale atlas-level single-cell datasets

An efficient, accurate and flexible method for single-cell data integration.

Check out our manuscript in Nature Computational Science.

Reproducibility

We provide the source code for reproducing the experiments in the paper "Adversarial domain translation networks for integrating large-scale atlas-level single-cell datasets".

Installation

To run Portal, please follow the installation instructions:

git clone https://github.com/YangLabHKUST/Portal.git
cd Portal
conda env update --file environment.yml
conda activate portal

Installation normally takes less than 5 minutes.
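As a quick sanity check, you can verify that the package imports in the activated environment:

python -c "import portal; print(portal.model.Model)"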

Quick Start

Basic Usage

Starting with raw count matrices formatted as AnnData objects, Portal uses the standard preprocessing pipeline adopted by Seurat and Scanpy, followed by PCA for dimensionality reduction. After preprocessing, Portal can be trained via model.train().

import portal
import scanpy as sc

# read AnnData
adata_1 = sc.read_h5ad("adata_1.h5ad")
adata_2 = sc.read_h5ad("adata_2.h5ad")

model = portal.model.Model()
model.preprocess(adata_1, adata_2) # perform preprocessing and PCA
model.train() # train the model
model.eval() # get integrated latent representation of cells

The evaluation step model.eval() stores the integrated latent representation of cells in model.latent, which can be used for downstream integrative analysis.
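For example, the latent representation can be passed to Scanpy for clustering and visualization. The following is a minimal sketch, assuming model.latent is a NumPy array with one row per cell and that cells from adata_1 are stacked above those from adata_2 (an assumption about row order, not stated in this README):

import anndata
import scanpy as sc

# build an AnnData object on the integrated latent space
adata_all = anndata.AnnData(X=model.latent)
# assumed row order: cells of adata_1 first, then cells of adata_2
adata_all.obs["batch"] = ["batch_1"] * adata_1.n_obs + ["batch_2"] * adata_2.n_obs

# standard Scanpy workflow on the latent representation
sc.pp.neighbors(adata_all, use_rep="X")  # neighborhood graph on the latent space
sc.tl.umap(adata_all)                    # UMAP embedding
sc.pl.umap(adata_all, color="batch")     # check mixing across datasets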

Parameters in portal.model.Model():

  • lambdacos: Coefficient of the regularizer for preserving cosine similarity across domains. Default: 20.0.
  • training_steps: Number of steps for training. Default: 2000. Use training_steps=1000 for datasets with sample size < 20,000.
  • npcs: Dimensionality of the embeddings in each domain (number of PCs). Default: 30.
  • n_latent: Dimensionality of the shared latent space. Default: 20.
  • batch_size: Batch size for training. Default: 500.
  • seed: Random seed. Default: 1234.

The default setting of the parameter lambdacos works well in general. We also support tuning this parameter to achieve better performance; see Tuning lambdacos (optional). For integration tasks where cosine similarity is not a reliable cross-domain correspondence (such as cross-species integration), we recommend using a lower value, such as lambdacos=10.0, as in the sketch below.
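As an illustrative sketch, a cross-species task on a dataset with sample size < 20,000 could be configured with the parameters listed above (the values follow the recommendations in this README):

# lower lambdacos for cross-species data, fewer steps for a smaller dataset
model = portal.model.Model(lambdacos=10.0, training_steps=1000)
model.preprocess(adata_1, adata_2)  # perform preprocessing and PCA
model.train()                       # train the model
model.eval()                        # integrated representation in model.latent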

Memory-efficient Version

To handle large single-cell datasets, we also provide a memory-efficient version that reads mini-batches from disk:

model = portal.model.Model()
model.preprocess_memory_efficient(adata_A_path="adata_1.h5ad", adata_B_path="adata_2.h5ad")
model.train_memory_efficient()
model.eval_memory_efficient()

Integrating Multiple Datasets

Portal integrates multiple datasets incrementally. Given a list of AnnData objects adata_list = [adata_1, ..., adata_n], they can be integrated by running the following commands:

lowdim_list = portal.utils.preprocess_datasets(adata_list)
integrated_data = portal.utils.integrate_datasets(lowdim_list)

Tuning lambdacos (optional)

Optionally, the parameter lambdacos can be tuned in the range [15.0, 50.0]. Users can run the following commands to search for the value that yields the best integration result in terms of the mixing metric:

lowdim_list = portal.utils.preprocess_datasets(adata_list)
integrated_data = portal.utils.integrate_datasets(lowdim_list, search_cos=True)

Recovering expression matrices

Portal can also recover harmonized expression matrices (at the scaled or log-normalized level):

lowdim_list, hvg, mean, std, pca = portal.utils.preprocess_recover_expression(adata_list)
expression_scaled, expression_log_normalized = portal.utils.integrate_recover_expression(lowdim_list, mean, std, pca)
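The recovered matrices can be wrapped back into AnnData objects for downstream analysis. The following is a minimal sketch, assuming the rows follow the concatenated cell order of adata_list and the columns correspond to the highly variable genes returned in hvg (both are assumptions, not stated in this README):

import anndata

# wrap the harmonized log-normalized expression (assumed cells x hvg layout)
adata_integrated = anndata.AnnData(X=expression_log_normalized)
adata_integrated.var_names = hvg  # assumed to match the returned hvg list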

Demos

We provide demos for users to get a quick start: Demo 1, Demo 2.

Development

This package is developed by Jia Zhao (jzhaoaz@connect.ust.hk) and Gefei Wang (gwangas@connect.ust.hk).

Citation

Jia Zhao, Gefei Wang, Jingsi Ming, Zhixiang Lin, Yang Wang, The Tabula Microcebus Consortium, Angela Ruohao Wu, Can Yang. Adversarial domain translation networks for integrating large-scale atlas-level single-cell datasets. Nature Computational Science 2, 317–330 (2022).
