Dynamic feature distillation framework for robust zero-shot LLM annotation.

Project description

LLM-scCurator

Dynamic feature masking to improve the robustness of zero-shot cell-type annotation with LLMs.

Docs · DOI · License: MIT · Python 3.9+ · R · Jupyter · Docker


🚀 Overview

LLM-scCurator is a Large Language Model–based curator for single-cell and spatial transcriptomics. It performs noise-aware marker distillation—suppressing technical programs (e.g., ribosomal/mitochondrial), clonotype signals (TCR/Ig), and stress signatures while rescuing lineage markers—and applies leakage-safe lineage filters before prompting an LLM. It supports hierarchical (coarse-to-fine) annotation for single-cell and spatial transcriptomics data. See the documentation for full tutorials and API reference: https://llm-sccurator.readthedocs.io/

Key Features

  • 🛡️ Noise-aware filtering: Automatically removes lineage-specific noise (TCR/Ig) and state-dependent noise (ribosomal/mitochondrial).
  • 🧠 Context-aware inference: Automatically infers lineage context (e.g., "T cell") to guide LLM reasoning.
  • 🔬 Hierarchical discovery: One-line function to dissect complex tissues into major lineages and fine-grained subtypes.
  • 🌍 Spatial ready: Validated on scRNA-seq (10x) and spatial transcriptomics (Xenium, Visium).
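
As a rough illustration of the noise-aware filtering idea, the sketch below drops common technical and clonotype gene families from a ranked marker list. This is a hand-rolled simplification, not the package's actual implementation: the regexes are assumed human gene-symbol conventions, and the real distillation also rescues lineage markers and applies leakage-safe filters.

```python
import re

# Simplified, illustrative gene-family patterns (human symbols).
NOISE_PATTERNS = [
    r"^RP[SL]",          # ribosomal proteins (RPS*/RPL*)
    r"^MT-",             # mitochondrial genes
    r"^TR[ABGD][VDJC]",  # TCR clonotype segments
    r"^IG[HKL][VDJC]",   # immunoglobulin clonotype segments
]
NOISE_RE = re.compile("|".join(NOISE_PATTERNS))

def distill_markers(ranked_genes, n_top=50):
    """Drop likely technical/clonotype genes, then keep the top n_top."""
    return [g for g in ranked_genes if not NOISE_RE.match(g)][:n_top]

print(distill_markers(["CD3E", "RPS6", "MT-CO1", "TRBV7-2", "IL7R"]))
# → ['CD3E', 'IL7R']
```

A pattern list like this cannot distinguish a stressed cell state from a genuine lineage signal, which is exactly the gap the package's distillation step is designed to close.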

📦 Installation

  • Option A (recommended): Install from PyPI

    pip install "llm-sc-curator[gemini]"
    # or: pip install "llm-sc-curator[openai]"
    # or: pip install "llm-sc-curator[all]"
    

    Notes:

    The first PyPI release is v0.1.0.

  • Option B: Install from GitHub (development)

    # 1. Clone the repository
    git clone https://github.com/kenflab/LLM-scCurator.git
    
    # 2. Navigate to the directory
    cd LLM-scCurator
    
    # 3. Install the package (and dependencies)
    pip install .
    

Notes:

If you already have a Scanpy/Seurat pipeline environment, you can install LLM-scCurator directly into that environment.


🐳 Docker (official environment)

We provide an official Docker environment (Python + R + Jupyter) sufficient to run LLM-scCurator and reproduce most of the paper's figures.

  • Option A: Prebuilt image (recommended)

    Use the published image from GitHub Container Registry (GHCR).

    docker pull ghcr.io/kenflab/llm-sc-curator:official
    

    Run Jupyter:

    # run from the repo root so "$PWD" mounts your notebooks at /work
    docker run --rm -it \
      -p 8888:8888 \
      -v "$PWD":/work \
      -e GEMINI_API_KEY \
      -e OPENAI_API_KEY \
      ghcr.io/kenflab/llm-sc-curator:official
    

    Open Jupyter: http://localhost:8888

    (Use the token printed in the container logs.)

    Notes:

    For manuscript reproducibility, we also provide versioned tags (e.g., :v0.1.0). Prefer a version tag when matching a paper release.

  • Option B: Build locally (development)

    # Option B1: Build locally with Compose
    # from the repo root
    docker compose -f docker/docker-compose.yml build
    docker compose -f docker/docker-compose.yml up
    

    Open Jupyter: http://localhost:8888

    Workspace mount: /work

    # Option B2: Build locally without Compose (alternative)
    # from the repo root
    docker build -f docker/Dockerfile -t llm-sc-curator:official .
    

    Run Jupyter:

    docker run --rm -it \
      -p 8888:8888 \
      -v "$PWD":/work \
      -e GEMINI_API_KEY \
      -e OPENAI_API_KEY \
      llm-sc-curator:official
    

    Open Jupyter: http://localhost:8888


🖥️ Apptainer / Singularity (HPC)

  • Option A: Prebuilt image (recommended)

    Use the published image from GitHub Container Registry (GHCR).

    apptainer build llm-sc-curator.sif docker://ghcr.io/kenflab/llm-sc-curator:official
    
  • Option B: Build a .sif from the locally built Docker image (development)

    docker compose -f docker/docker-compose.yml build
    apptainer build llm-sc-curator.sif docker-daemon://llm-sc-curator:official
    

Run Jupyter (either image):

apptainer exec --cleanenv \
  --bind "$PWD":/work \
  llm-sc-curator.sif \
  bash -lc 'jupyter lab --ip=0.0.0.0 --port=8888 --no-browser'

🔒 Privacy

We respect the sensitivity of clinical and biological data. LLM-scCurator is architected to ensure that raw expression matrices and cell-level metadata never leave your local environment.

  • Local execution: All heavy lifting—preprocessing, confounding gene removal, and feature ranking—occurs locally on your machine.
  • Minimal transmission: When interfacing with external LLM APIs, the system transmits only anonymized, cluster-level marker lists (e.g., top 50 ranked gene symbols) and basic tissue context.
  • User control: You retain control over any additional background information (e.g., disease state, treatment conditions, and platform) provided via custom prompts. Please review your institution’s data policy and the LLM provider’s terms before sending any information to external LLM APIs.
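
To make the "minimal transmission" point concrete, the payload sent to an external API is on the order of the following sketch. The shape and the `build_prompt_payload` helper are hypothetical illustrations, not the package's API: the point is what is present (cluster-level gene symbols, coarse tissue context) and what is absent (expression values, per-cell metadata).

```python
def build_prompt_payload(cluster_markers, tissue="PBMC", n_top=50):
    """cluster_markers: dict mapping cluster id -> ranked gene symbols.

    Only anonymized cluster-level symbols and coarse tissue context are
    included; no expression matrix and no per-cell metadata.
    """
    return {
        "tissue_context": tissue,
        "clusters": {str(cid): genes[:n_top]
                     for cid, genes in cluster_markers.items()},
    }

payload = build_prompt_payload({0: ["CD3E", "IL7R"], 1: ["MS4A1", "CD79A"]})
```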

⚡ Quick Start

🐍 For Python / Scanpy Users

  1. Set your API key (simplest: paste it into the notebook)
GEMINI_API_KEY = "PASTE_YOUR_KEY_HERE"
# OPENAI_API_KEY = "PASTE_YOUR_KEY_HERE"  # optional
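
Alternatively, if the key is already exported in your shell (see the Backends section below), it can be read from the environment instead of pasted, along these lines:

```python
import os

# Read the key from the environment instead of hard-coding it in the notebook.
GEMINI_API_KEY = os.environ.get("GEMINI_API_KEY", "PASTE_YOUR_KEY_HERE")
```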
  2. Run LLM-scCurator
import scanpy as sc
from llm_sc_curator import LLMscCurator

# Initialize with your API Key (Google AI Studio)
curator = LLMscCurator(api_key=GEMINI_API_KEY)

# Load your data
adata = sc.read_h5ad("my_data.h5ad")

# Run fully automated hierarchical annotation
adata = curator.run_hierarchical_discovery(adata)

# Visualize
sc.pl.umap(adata, color=['major_type', 'fine_type'])

📊 For R / Seurat Users

You can use LLM-scCurator in two ways:

  • Option A (recommended): Export → run in Python. We provide a helper script, examples/R/export_to_curator.R, to export your Seurat object for processing in Python.

    # run from the repo root
    Rscript examples/R/export_to_curator.R \
      --in_rds path/to/seurat_object.rds \
      --outdir out_seurat \
      --cluster_col seurat_clusters
    

    Output:

    • counts.mtx (raw counts; recommended)
    • features.tsv (gene list)
    • obs.csv (cell metadata; includes seurat_clusters)
    • umap.csv (optional, if available)

    Notes:

    • Continue in the Python/Colab tutorial to run LLM-scCurator, which writes cluster_curated_map.csv.
    • cluster_curated_map.csv can then be re-imported into Seurat for plotting.
  • Option B: Run from R via reticulate (advanced)

    If you prefer to stay in R, you can invoke the Python package via reticulate (Python-in-R). This is more sensitive to Python environment configuration, so we recommend Option A for most users.
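
For Option A, the exported folder can be read back in Python along these lines. This is a sketch using the file names listed above; `load_curator_export` is illustrative, not part of the package, and the matrix orientation depends on the export.

```python
import pandas as pd
from scipy.io import mmread

def load_curator_export(outdir):
    """Load the folder written by export_to_curator.R
    (counts.mtx, features.tsv, obs.csv)."""
    counts = mmread(f"{outdir}/counts.mtx").tocsr()
    genes = pd.read_csv(f"{outdir}/features.tsv",
                        sep="\t", header=None)[0].tolist()
    obs = pd.read_csv(f"{outdir}/obs.csv", index_col=0)
    return counts, genes, obs
```

With anndata installed, something like `AnnData(X=counts.T, obs=obs)` would then reconstruct an object for the Scanpy quick start (assuming a genes × cells matrix; check the orientation of your export).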


📄 Manuscript reproduction

For manuscript-facing verification (benchmarks, figures, and Source Data), use the versioned assets under paper/. See paper/README.md for the primary instructions.

Notes:

  • Figures are supported by exported Source Data in paper/source_data/ (see paper/FIGURE_MAP.csv for panel → file mapping).
  • Re-running LLM/API calls or external reference annotators is optional; LLM API outputs may vary across runs even with temperature=0.
  • For transparency, we include read-only provenance notebooks with example run logs in paper/notebooks/.

📓 Colab notebooks

  • Python / Scanpy quickstart (recommended: colab_quickstart.ipynb)

    • Open In Colab
      ☝️ Runs end-to-end on a public Scanpy dataset (no API key required by default).

      • 🔑 Optional: If an API key is provided (replace GEMINI_API_KEY = "YOUR_KEY_HERE"), the notebook can also run LLM-scCurator automatic hierarchical cell annotation.
  • OpenAI quickstart (OpenAI backend: colab_quickstart_openai.ipynb)

    • Open In Colab
      ☝️ Same workflow as the Python / Scanpy quickstart, but configured for the OpenAI backend.

      • 🔑 Optional: If an API key is provided (replace OPENAI_API_KEY = "YOUR_KEY_HERE"), the notebook can also run LLM-scCurator automatic hierarchical cell annotation. Note that the OpenAI API requires billing (paid API credits).
  • R / Seurat quickstart (export → Python LLM-scCurator → back to Seurat: colab_quickstart_R.ipynb)

    • Open In Colab
      ☝️ Runs a minimal Seurat workflow in R, exports a Seurat object to an AnnData-ready folder, runs LLM-scCurator in Python, then re-imports labels into Seurat for visualization and marker sanity checks.
      • 🔑 Optional: Requires an API key for LLM-scCurator annotation (same setup as above).
      • Recommended for Seurat users who want to keep Seurat clustering/UMAP but use LLM-scCurator for robust marker distillation and annotation.

🔑 Backend setup (LLM API keys)

Set your provider API key as an environment variable:

  • GEMINI_API_KEY for Google Gemini
  • OPENAI_API_KEY for OpenAI API

See each provider’s documentation for how to obtain an API key and for current usage policies.

  • Option A (Gemini steps):

    1. Go to Google AI Studio.
    2. Log in with your Google Account.
    3. Click Get API key (top-left) → Create API key.
    4. Copy the key and use it in your code.
  • Option B (OpenAI steps):

    1. Go to OpenAI Platform.
    2. Log in with your OpenAI Account.
    3. Click Create new secret key → Create secret key.
    4. Copy the key and use it in your code.

Notes:

Google Gemini can be used within its free-tier limits.
OpenAI API usage requires enabling billing (paid API credits); ChatGPT subscriptions (e.g. Plus) do NOT include API usage.
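
For notebooks and scripts launched from a terminal, the key can be exported once per shell session, for example:

```shell
# Export the key for the current shell session (replace the placeholder).
export GEMINI_API_KEY="PASTE_YOUR_KEY_HERE"
# export OPENAI_API_KEY="PASTE_YOUR_KEY_HERE"   # only for the OpenAI backend
```

Appending the export line to ~/.bashrc or ~/.zshrc persists it across sessions.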


Citation

Zenodo archive (v0.1.0) DOI: 10.5281/zenodo.17970494
GitHub release tag: v0.1.0

Project details


Download files

Download the file for your platform.

Source Distribution

llm_sc_curator-0.1.0.tar.gz (50.8 kB)

Uploaded Source

Built Distribution


llm_sc_curator-0.1.0-py3-none-any.whl (53.3 kB)

Uploaded Python 3

File details

Details for the file llm_sc_curator-0.1.0.tar.gz.

File metadata

  • Download URL: llm_sc_curator-0.1.0.tar.gz
  • Upload date:
  • Size: 50.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for llm_sc_curator-0.1.0.tar.gz:

  • SHA256: 5a5ca5ec53c8b2f239ef55f2c7c09897b3be1a8b187291e1ccee3fe046f4d896
  • MD5: cd166eafe129592b4d8b701647591d22
  • BLAKE2b-256: a511d0c2543bc8d2d8feae0796ebdff1701e7b72e03e89293a89acacf790ec97


Provenance

The following attestation bundles were made for llm_sc_curator-0.1.0.tar.gz:

Publisher: pypi-release.yml on kenflab/LLM-scCurator

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file llm_sc_curator-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: llm_sc_curator-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 53.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for llm_sc_curator-0.1.0-py3-none-any.whl:

  • SHA256: a062a54dfac6cb7a95eb70d8225808c5fa110c1765cd3404a931981a3212bce1
  • MD5: 46454d584433e6dcc5d279ce24609c19
  • BLAKE2b-256: 50fc5855f36b40acdd91375bf0343d3546a24cbb06837a941ee71f2504b007cd


Provenance

The following attestation bundles were made for llm_sc_curator-0.1.0-py3-none-any.whl:

Publisher: pypi-release.yml on kenflab/LLM-scCurator

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
