deeplife-toolkit

PyPI version · CI · Python 3.12+ · License: MIT

Official DeepLife Python toolkit for TwinCell and related analysis: call the Open API from Python, validate and preprocess AnnData (.h5ad) locally, aggregate to pseudo-bulk with deeplife_toolkit.pseudobulk, and run sample-level differential expression with deeplife_toolkit.differential_expression (suitable for pseudo-bulk, bulk, or other compatible count tables).

Source: github.com/deeplifeai/deeplife-toolkit · Docs: docs/README.md · Python: 3.12+ (see pyproject.toml).


Install

pip install deeplife-toolkit

API key: you need a DeepLife key (usually dl_…). Create or copy one from the TwinCell console (sign in, open API keys / Keys, then generate or reveal a key). In code, set DEEPLIFE_API_KEY in the environment or pass api_key= when constructing the client. Your org may also document keys beside its Open API (Swagger) entrypoints.
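A small helper can keep key handling in one place. This is a hypothetical sketch (resolve_api_key is not part of the toolkit); it only assumes, as described above, that keys look like dl_…:

```python
import os

def resolve_api_key(explicit=None):
    """Return an explicitly passed key if given, else fall back to DEEPLIFE_API_KEY."""
    key = explicit or os.environ.get("DEEPLIFE_API_KEY", "")
    if not key.startswith("dl_"):
        raise ValueError("expected a DeepLife key of the form dl_...")
    return key
```

You could then call OpenDeepLifeClient(api_key=resolve_api_key()) and get one consistent error message whether the key is missing or malformed.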

From a git clone (contributors):

uv sync --group dev

Package layout

| Import | Role |
| --- | --- |
| deeplife_toolkit.twincell.api | HTTP client (OpenDeepLifeClient, async variant), predictions, datasets, TwinCell / TwinCellStudy workflows, data catalog helpers, plotting |
| deeplife_toolkit.twincell.validation | TwinCell-ready preprocessing and local validation (DeepLifePreprocessor, configs, CLI twincell-validate-h5ad) |
| deeplife_toolkit.pseudobulk, deeplife_toolkit.differential_expression | Pseudo-bulk from single-cell AnnData, and sample-level DE (CLIs twincell-pseudobulk, twincell-diffexpr) |

Install with pip install deeplife-toolkit (the PyPI project name uses a hyphen). Import deeplife_toolkit.twincell, deeplife_toolkit.pseudobulk, and deeplife_toolkit.differential_expression (nested packages inside the deeplife_toolkit distribution, with underscores).
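Conceptually, pseudo-bulk aggregation sums raw counts over all cells that share a (sample, cell type) group. A dependency-free sketch of that idea (this is not the deeplife_toolkit.pseudobulk API, just the underlying technique on toy data):

```python
from collections import defaultdict

# Toy single-cell counts: one row per cell, three genes per row,
# with a sample label and a cell-type label attached to each cell.
cells = [
    ("s1", "T", [3, 0, 1]),
    ("s1", "T", [2, 1, 0]),
    ("s1", "B", [0, 4, 2]),
    ("s2", "T", [1, 1, 1]),
    ("s2", "B", [5, 0, 0]),
]

# Pseudo-bulk: sum counts over all cells sharing a (sample, cell_type) key.
pseudobulk = defaultdict(lambda: [0, 0, 0])
for sample, cell_type, counts in cells:
    for gene_index, count in enumerate(counts):
        pseudobulk[(sample, cell_type)][gene_index] += count

print(dict(pseudobulk))
```

The resulting table has one row per (sample, cell type) pair, which is the sample-level unit that downstream differential expression expects.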


Minimal API usage

End-to-end flow: preprocess/validate .h5ad → create_prediction → poll or watch_prediction → read results (and optional influence / causal helpers). The toolkit validates uploads the same way the TwinCell API does.

import os
from deeplife_toolkit.twincell.api import OpenDeepLifeClient

client = OpenDeepLifeClient(api_key=os.environ["DEEPLIFE_API_KEY"])
prediction = client.create_prediction(dataset="twincell_ready.h5ad")
final = client.wait_for_prediction(prediction_id=prediction.prediction_id)
print(final.status)

Defaults: the client uses the toolkit’s configured API base URL; pass base_url= for another environment. Retries apply to safe GET-style calls (polling), not duplicate uploads on POST. For TLS/proxy issues, use tls_verify= and trust_env= on the client; see OpenDeepLifeClient in deeplife_toolkit.twincell.api.
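If you poll manually instead of using wait_for_prediction, exponential backoff keeps request volume down while a job runs. A generic polling sketch (the status names and helper here are illustrative, not the toolkit's API):

```python
import time

def poll_until_done(get_status, timeout_s=600.0, interval_s=2.0, max_interval_s=30.0):
    """Call get_status() with exponential backoff until a terminal state or timeout."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        status = get_status()
        if status in {"succeeded", "failed"}:
            return status
        time.sleep(interval_s)
        # Double the wait each round, capped so long jobs still get checked regularly.
        interval_s = min(interval_s * 2, max_interval_s)
    raise TimeoutError("prediction did not reach a terminal state in time")

# Simulated server that reports "pending" twice before finishing.
states = iter(["pending", "pending", "succeeded"])
print(poll_until_done(lambda: next(states), interval_s=0.01))
```

With a real client, get_status would be a closure over something like a per-prediction status lookup; the backoff logic stays the same.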

For preprocessing options (raw layer, disease filter, gene symbol column, etc.), use deeplife_toolkit.twincell.validation or the example notebooks.
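To make the idea of local validation concrete, here is a hypothetical sketch of the kind of sanity checks a validator might run on a count matrix before upload. DeepLifePreprocessor defines the real rules; basic_count_checks below is invented for illustration:

```python
def basic_count_checks(matrix, gene_names):
    """Collect simple problems in a count matrix before upload.

    matrix: list of per-cell count rows; gene_names: one label per column.
    Illustrative only; the toolkit's validator enforces its own, fuller ruleset.
    """
    problems = []
    if len(gene_names) != len(set(gene_names)):
        problems.append("duplicate gene symbols")
    if any(len(row) != len(gene_names) for row in matrix):
        problems.append("row length does not match gene_names")
    if any(value < 0 for row in matrix for value in row):
        problems.append("negative counts")
    return problems

print(basic_count_checks([[0, 1], [2, 3]], ["TP53", "BRCA1"]))  # []
```

Running such checks locally (via the validation module or the twincell-validate-h5ad CLI) catches malformed files before they cost an upload round-trip.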


Example notebooks

Curated tutorials live under notebooks/ (next to setup_deeplife_environment.sh). The PyPI wheel does not bundle these files; use a git checkout of this repository (or copy the .ipynb files) to run them.

kang_pbmc.ipynb is the Kang PBMC (IFN-β) tutorial: preprocess → pseudo-bulk → PyDESeq2 → TwinCellStudy, scorecards, causal subgraphs, and simulate(), with an optional commented block for in-memory skin_atlas_2024.

Layout

| Path | Role |
| --- | --- |
| notebooks/ | Jupyter TwinCell tutorial (kang_pbmc.ipynb). |
| notebooks/data/ | Downloaded .h5ad and notebook outputs (gitignored). The notebook resolves the repo root from the kernel cwd or the installed deeplife_toolkit package path, then writes here when paths are relative to the repo. |
| notebooks/setup_deeplife_environment.sh | Optional bootstrap: uv sync --group dev --extra notebook, optional AWS CodeArtifact token for internal indexes, and a Python (deeplife-toolkit) ipykernel. Run from anywhere; it cds to the repo root automatically. |

Paths and API keys: use a git checkout so notebooks/data/ exists on disk. The notebook walks up from the kernel cwd for a pyproject.toml with name = "deeplife-toolkit", or falls back to the installed deeplife_toolkit package path. It loads Kang data over HTTP via get_adata_from_url and uses getpass (or DEEPLIFE_API_KEY) before TwinCell study flows.
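The walk-up logic described above can be approximated like this (a sketch of the technique, not the notebook's actual code):

```python
from pathlib import Path

def find_repo_root(start):
    """Walk upward from start looking for this repo's pyproject.toml."""
    for candidate in [start, *start.parents]:
        pyproject = candidate / "pyproject.toml"
        if pyproject.is_file() and 'name = "deeplife-toolkit"' in pyproject.read_text():
            return candidate
    return None  # caller falls back to the installed package path
```

Checking the project name, not just the file's existence, avoids stopping at an unrelated pyproject.toml higher up the tree.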

| Notebook | When to use |
| --- | --- |
| notebooks/kang_pbmc.ipynb | Kang PBMC (IFN-β): preprocess → pseudo-bulk → PyDESeq2 → TwinCellStudy, scorecards, causal subgraphs, simulate(). |

If you installed deeplife-toolkit from PyPI: create a virtual environment, pip install deeplife-toolkit into it, clone this repo to get the notebook, then install whatever you need to run Jupyter (for example pip install "deeplife-toolkit[notebook]", JupyterLab or Notebook, and seaborn, which kang_pbmc.ipynb uses as written). Select that environment as the Jupyter kernel.

Example dataset helpers live in deeplife_toolkit.twincell.api.datasets (EXAMPLE_DATASETS is empty until you add HTTP(S) links; download_example_dataset streams those URLs). In-memory demo atlas: get_adata(dataset_name="skin_atlas_2024") from deeplife_toolkit.twincell.api.data.
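Streaming a dataset URL to disk, rather than buffering a whole .h5ad in memory, generally looks like the following. This is a generic chunked-download sketch, not the implementation of download_example_dataset:

```python
import urllib.request

def stream_download(url, dest_path, chunk_size=1 << 16):
    """Stream a URL to dest_path in fixed-size chunks."""
    with urllib.request.urlopen(url) as response, open(dest_path, "wb") as out:
        while chunk := response.read(chunk_size):
            out.write(chunk)
    return dest_path
```

Chunked writes keep memory flat even for multi-gigabyte atlases, at the cost of one extra loop over the response body.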

PyPI releases can lag main; after pip install -U deeplife-toolkit, check that imports match the layout you expect.
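A quick standard-library check after upgrading, using the module names this README documents:

```python
import importlib.util

def can_import(module_name):
    """True if module_name resolves in the current environment."""
    try:
        return importlib.util.find_spec(module_name) is not None
    except ModuleNotFoundError:
        # Raised when a parent package in a dotted name is itself missing.
        return False

for name in ("deeplife_toolkit.twincell",
             "deeplife_toolkit.pseudobulk",
             "deeplife_toolkit.differential_expression"):
    print(name, "ok" if can_import(name) else "missing")
```

find_spec checks importability without executing the module, so this is safe to run in any environment.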


Development

uv sync --group dev
make check-all    # or run ruff, mypy, and pytest individually; see the Makefile

CI / CD: .github/workflows/ci.yml runs on pushes and PRs to main / master: uv sync --frozen --group dev, Ruff (format + lint), mypy, pytest, then uv build and twine check --strict. Releases use .github/workflows/pypi-publish.yml (same checks, then PyPI via OIDC). .github/dependabot.yml bumps GitHub Actions dependencies weekly. Maintainer steps: README_PYPI_TODO.md.

Contributors who rely on internal CodeArtifact-backed uv indexes can run sh notebooks/setup_deeplife_environment.sh once (AWS SSO + uv sync --group dev --extra notebook + optional Jupyter kernel registration), then open the tutorials under notebooks/.


Publishing to PyPI (maintainers)

Use README_PYPI_TODO.md for trusted publishing, GitHub environment pypi, and version tags.
