
Official DeepLife Python toolkit on PyPI: TwinCell (`deeplife_toolkit.twincell`), pseudo-bulk (`deeplife_toolkit.pseudobulk`), differential expression (`deeplife_toolkit.differential_expression`). Optional `[notebook]` extra for Jupyter tutorials.

Project description

deeplife-toolkit

PyPI version · CI · Python 3.12+ · License: MIT

Official DeepLife Python toolkit for TwinCell and related analysis: call the Open API from Python, validate and preprocess AnnData (.h5ad) locally, aggregate to pseudo-bulk with deeplife_toolkit.pseudobulk, and run sample-level differential expression with deeplife_toolkit.differential_expression (suitable for pseudo-bulk, bulk, or other compatible count tables).

Source: github.com/deeplifeai/deeplife-toolkit · Documentation: docs index on GitHub · Python: 3.12+ (see pyproject.toml).


Install

pip install deeplife-toolkit

API key: you need a DeepLife key (usually dl_…). Create or copy one from the TwinCell console (sign in, then open API keys / Keys). In code, set DEEPLIFE_API_KEY in the environment or pass api_key= when constructing the client. API documentation for your deployment is usually at {base_url}/docs when enabled (the client defaults to the production Open API; pass base_url= for another environment).

From a git clone (contributors):

uv sync --group dev

Documentation

Long-form guides (same content as the Documentation link on PyPI):

| Guide | Description |
| --- | --- |
| Installation | pip install, optional `[notebook]`, API keys, verify imports |
| Pseudo-bulk & differential expression | AnnData → pseudo-bulk, sample-level DE, CLIs |
| TwinCell Open API | HTTP client, predictions, studies, datasets |

Index: docs/README.md.


Package layout

| Import | Role |
| --- | --- |
| `deeplife_toolkit.twincell.api` | HTTP client (`OpenDeepLifeClient`, async variant), predictions, datasets, TwinCell / TwinCellStudy workflows, data catalog helpers, plotting |
| `deeplife_toolkit.twincell.validation` | TwinCell-ready preprocessing and local validation (`DeepLifePreprocessor`, configs, CLI `twincell-validate-h5ad`) |
| `deeplife_toolkit.pseudobulk`, `deeplife_toolkit.differential_expression` | Pseudo-bulk from single-cell AnnData, and sample-level DE (CLIs `twincell-pseudobulk`, `twincell-diffexpr`) |
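To make the pseudo-bulk idea concrete, here is a minimal, dependency-free sketch of the aggregation step: summing per-cell gene counts within each (sample, cell type) group. This illustrates the concept only; `deeplife_toolkit.pseudobulk` operates on AnnData objects, not plain dicts, and the function below is not its API:

```python
from collections import defaultdict


def pseudobulk(cell_counts, groups):
    """Sum per-cell gene counts within each (sample, cell_type) group.

    cell_counts: list of {gene: count} dicts, one per cell.
    groups: parallel list of (sample, cell_type) labels.
    """
    agg = defaultdict(lambda: defaultdict(int))
    for counts, group in zip(cell_counts, groups):
        for gene, n in counts.items():
            agg[group][gene] += n
    return {group: dict(genes) for group, genes in agg.items()}


# Three cells: two T cells from sample s1, one B cell from sample s2.
cells = [{"geneA": 1}, {"geneA": 2, "geneB": 3}, {"geneB": 1}]
labels = [("s1", "T"), ("s1", "T"), ("s2", "B")]
pb = pseudobulk(cells, labels)
```

Each resulting row is a sample-level count vector, which is what downstream sample-level DE expects.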

Install with pip install deeplife-toolkit (PyPI project name, hyphen). Import deeplife_toolkit.twincell, deeplife_toolkit.pseudobulk, and deeplife_toolkit.differential_expression (nested packages inside the deeplife_toolkit distribution).
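A quick way to verify the nested packages after installation is a generic import check. `missing_modules` is illustrative, not a toolkit function:

```python
import importlib


def missing_modules(names):
    """Return the subset of module names that fail to import."""
    missing = []
    for name in names:
        try:
            importlib.import_module(name)
        except ImportError:
            missing.append(name)
    return missing


# After `pip install deeplife-toolkit`, this list should come back empty:
# missing_modules(["deeplife_toolkit.twincell",
#                  "deeplife_toolkit.pseudobulk",
#                  "deeplife_toolkit.differential_expression"])
```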


Minimal API usage

End-to-end flow: preprocess/validate .h5ad → create_prediction → poll or watch_prediction → read results (and optional influence / causal helpers). The toolkit validates uploads the same way the TwinCell API does.

```python
import os
from deeplife_toolkit.twincell.api import OpenDeepLifeClient

client = OpenDeepLifeClient(api_key=os.environ["DEEPLIFE_API_KEY"])
prediction = client.create_prediction(dataset="twincell_ready.h5ad")
final = client.wait_for_prediction(prediction_id=prediction.prediction_id)
print(final.status)
```

Defaults: the client uses the toolkit’s configured API base URL; pass base_url= for another environment. Retries apply to safe GET-style calls (polling), not to duplicate uploads on POST. For TLS/proxy issues, use tls_verify= and trust_env= on the client; see OpenDeepLifeClient in deeplife_toolkit.twincell.api.
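The safe-to-retry polling described above can be sketched as a generic loop. This is illustrative only; `poll_until_done` and the status strings are assumptions, not the client's actual internals (use `wait_for_prediction` in practice):

```python
import time


def poll_until_done(fetch_status, interval=2.0, timeout=600.0):
    """Repeatedly call a safe GET-style status fetch until a terminal state.

    fetch_status: zero-argument callable returning a status string.
    Raises TimeoutError if no terminal status arrives within `timeout` seconds.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = fetch_status()
        if status in ("succeeded", "failed"):
            return status
        time.sleep(interval)
    raise TimeoutError("prediction did not finish in time")
```

Because each iteration is an idempotent read, retrying or resuming the loop is harmless, unlike re-sending a POST upload.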

For preprocessing options (raw layer, disease filter, gene symbol column, etc.), use deeplife_toolkit.twincell.validation or the example notebooks.


Example notebooks

Tutorial notebooks live in this repository under notebooks/ and are not inside the PyPI wheel. Install deeplife-toolkit from PyPI, then open the .ipynb from a git clone of this repo or from a single notebook file you downloaded from GitHub.

kang_pbmc.ipynb (Kang PBMC IFN-β): preprocess → pseudo-bulk → PyDESeq2 → TwinCellStudy, scorecards, causal subgraphs, and simulate(), with an optional commented block for in-memory skin_atlas_2024.

Run the Kang tutorial (PyPI install)

  1. Create and activate a virtual environment (recommended), then upgrade pip.

  2. Install the library, Jupyter, the [notebook] extra, and seaborn (used in the notebook):

    pip install "deeplife-toolkit[notebook]" jupyterlab seaborn
    
  3. Get the notebook on disk, for example:

    git clone https://github.com/deeplifeai/deeplife-toolkit.git
    cd deeplife-toolkit
    
  4. Start Jupyter from the repository root so relative paths in the notebook resolve:

    jupyter lab
    

    Open notebooks/kang_pbmc.ipynb. In Kernel → Change kernel, pick the Python environment where you ran pip install (if it is missing from the list, run python -m ipykernel install --user --name deeplife-toolkit --display-name "Python (deeplife-toolkit)" inside that environment and choose that kernel).

The Kang dataset is fetched over HTTPS inside the notebook (get_adata_from_url). Part 2 (TwinCell HTTP) needs DEEPLIFE_API_KEY or a one-time getpass prompt.
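The env-var-or-getpass pattern the notebook relies on can be sketched as follows. `get_deeplife_key` is a hypothetical helper, not code from the notebook:

```python
import os
from getpass import getpass


def get_deeplife_key() -> str:
    """Use DEEPLIFE_API_KEY if set; otherwise prompt once and cache it."""
    key = os.environ.get("DEEPLIFE_API_KEY")
    if not key:
        key = getpass("DeepLife API key: ")
        # Cache in the environment so later notebook cells skip the prompt.
        os.environ["DEEPLIFE_API_KEY"] = key
    return key
```

Caching into `os.environ` keeps the prompt to a single appearance per kernel session.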

| Path | Role |
| --- | --- |
| `notebooks/` | Tutorial sources (kang_pbmc.ipynb). |
| `notebooks/data/` | Downloaded .h5ad and outputs (gitignored in the repo; created locally when you run cells). |

Contributors working from a git clone can use uv sync --group dev --extra notebook instead of pip to install the locked dependency set.

Example dataset helpers live in deeplife_toolkit.twincell.api.datasets (EXAMPLE_DATASETS is empty until you add HTTP(S) links; download_example_dataset streams those URLs). In-memory demo atlas: get_adata(dataset_name="skin_atlas_2024") from deeplife_toolkit.twincell.api.data.
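Streaming a dataset URL to disk, in the spirit of download_example_dataset, can look like the sketch below. `stream_download` is an illustrative stand-in, not the toolkit's function:

```python
import urllib.request
from pathlib import Path


def stream_download(url: str, dest: Path, chunk: int = 1 << 20) -> Path:
    """Stream a remote file to disk in 1 MiB chunks.

    Avoids loading a large .h5ad into memory before writing it out.
    """
    dest = Path(dest)
    dest.parent.mkdir(parents=True, exist_ok=True)
    with urllib.request.urlopen(url) as resp, open(dest, "wb") as fh:
        while True:
            block = resp.read(chunk)
            if not block:
                break
            fh.write(block)
    return dest
```

Chunked writes keep memory flat regardless of file size, which matters for multi-gigabyte atlases.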

After pip install -U deeplife-toolkit, confirm imports match the layout you expect; the latest release is always on PyPI.


Development

uv sync --group dev
make check-all    # or: ruff, mypy, pytest — see Makefile

CI: .github/workflows/ci.yml runs on pushes and PRs to main / master: uv sync --frozen --group dev, Ruff, mypy, pytest, uv build, twine check --strict. .github/dependabot.yml bumps GitHub Actions weekly.

Releases to PyPI: .github/workflows/pypi-publish.yml runs the same checks, then publishes with OIDC trusted publishing (GitHub environment pypi, trusted publisher configured on the deeplife-toolkit PyPI project). Bump version in pyproject.toml, push to main, then either push a tag matching v* or run the workflow manually from the Actions tab.

Project details


Download files

Download the file for your platform.

Source Distribution

deeplife_toolkit-0.1.1.tar.gz (515.2 kB)

Uploaded Source

Built Distribution


deeplife_toolkit-0.1.1-py3-none-any.whl (195.9 kB)

Uploaded Python 3

File details

Details for the file deeplife_toolkit-0.1.1.tar.gz.

File metadata

  • Download URL: deeplife_toolkit-0.1.1.tar.gz
  • Upload date:
  • Size: 515.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for deeplife_toolkit-0.1.1.tar.gz
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `da8a296c4947554c9915b2f016b4d474259fb4b26e4bf7afe5dff3d34939f882` |
| MD5 | `fd5c3f017885135d66f5c3e9945b9d5e` |
| BLAKE2b-256 | `11922aae72adf4277ccbb9bd08b8a1ae99cc5ac18cbee01dc22e34b1cd31f252` |
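To check a downloaded distribution against the hashes listed here, a minimal verification sketch using only the standard library:

```python
import hashlib


def sha256_of(path: str) -> str:
    """Compute the SHA256 hex digest of a file, reading in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as fh:
        for block in iter(lambda: fh.read(1 << 20), b""):
            h.update(block)
    return h.hexdigest()


# Compare sha256_of("deeplife_toolkit-0.1.1.tar.gz") to the SHA256 above;
# pip can also enforce this automatically via hash-checking mode
# (pinned requirements with --hash entries and --require-hashes).
```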


Provenance

The following attestation bundles were made for deeplife_toolkit-0.1.1.tar.gz:

Publisher: pypi-publish.yml on deeplifeai/deeplife-toolkit

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file deeplife_toolkit-0.1.1-py3-none-any.whl.

File metadata

File hashes

Hashes for deeplife_toolkit-0.1.1-py3-none-any.whl
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `ad0dfc735b0df5e1c65b06e8762a5b5ccf3b6de185831fccd9d1143a6875eef5` |
| MD5 | `f61a4a7ab56b10f1ca24c305cea02c6e` |
| BLAKE2b-256 | `18469bc15837e2835f92e5eea2f185e486943792ae0ac1ec4a997138089aeb22` |


Provenance

The following attestation bundles were made for deeplife_toolkit-0.1.1-py3-none-any.whl:

Publisher: pypi-publish.yml on deeplifeai/deeplife-toolkit

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
