
Official DeepLife Python toolkit on PyPI: TwinCell (`deeplife_toolkit.twincell`), pseudo-bulk (`deeplife_toolkit.pseudobulk`), differential expression (`deeplife_toolkit.differential_expression`). Optional `[notebook]` extra for Jupyter; bundled tutorials under `deeplife_toolkit.notebooks`.

Project description

deeplife-toolkit

PyPI version · CI · Python 3.12+ · License: MIT

Official DeepLife Python toolkit for TwinCell and related analysis: call the Open API from Python, validate and preprocess AnnData (.h5ad) locally, aggregate to pseudo-bulk with deeplife_toolkit.pseudobulk, and run sample-level differential expression with deeplife_toolkit.differential_expression (suitable for pseudo-bulk, bulk, or other compatible count tables).

Source: github.com/deeplifeai/deeplife-toolkit · Documentation: docs index on GitHub · Python: 3.12+ (see pyproject.toml).


Install

pip install deeplife-toolkit

API key: you need a DeepLife key (usually dl_…). Create or copy one from the TwinCell console (sign in, then open API keys / Keys). In code, set DEEPLIFE_API_KEY in the environment or pass api_key= when constructing the client. API documentation for your deployment is usually at {base_url}/docs when enabled (the client defaults to the production Open API host; pass base_url= for another environment).

Network: the TwinCell Open API is not on the public Internet today. It runs in DeepLife’s cluster and is reachable only when your machine has access to that network—for example by being signed into your organization’s Tailscale tailnet (or whatever access path your team documents). Local-only features (twincell-validate-h5ad, pseudo-bulk, differential expression on .h5ad) do not require API connectivity.

From a git clone (contributors):

uv sync --group dev

Documentation

Long-form guides (same content as the Documentation link on PyPI):

| Guide | Description |
| --- | --- |
| Installation | `pip install`, optional `[notebook]` extra, API keys, verify imports |
| Pseudo-bulk & differential expression | AnnData → pseudo-bulk, sample-level DE, CLIs |
| TwinCell Open API | HTTP client, predictions, studies, datasets |

Index: docs/README.md.
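The core idea behind the pseudo-bulk guide — summing raw single-cell counts over all cells from the same sample before running sample-level DE — can be sketched with pandas. This is an illustrative snippet, not the toolkit's implementation; the gene and sample names are made up:

```python
import pandas as pd

# Toy cells x genes count matrix; rows are cells, columns are genes.
counts = pd.DataFrame(
    {"GENE_A": [1, 2, 3, 4], "GENE_B": [0, 5, 1, 1]},
    index=["cell1", "cell2", "cell3", "cell4"],
)

# Sample label for each cell (in practice this comes from AnnData .obs).
sample_of_cell = pd.Series(
    ["s1", "s1", "s2", "s2"], index=counts.index, name="sample"
)

# Pseudo-bulk: sum raw counts over all cells belonging to the same sample,
# yielding one count row per sample suitable for bulk-style DE tools.
pseudobulk = counts.groupby(sample_of_cell).sum()
print(pseudobulk)
```

The resulting samples x genes table is what tools like PyDESeq2 expect as input.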


Package layout

| Import | Role |
| --- | --- |
| `deeplife_toolkit.twincell.api` | HTTP client (`OpenDeepLifeClient`, async variant), predictions, datasets, TwinCell / TwinCellStudy workflows, data catalog helpers, plotting |
| `deeplife_toolkit.twincell.validation` | TwinCell-ready preprocessing and local validation (`DeepLifePreprocessor`, configs, CLI `twincell-validate-h5ad`) |
| `deeplife_toolkit.pseudobulk`, `deeplife_toolkit.differential_expression` | Pseudo-bulk from single-cell AnnData, and sample-level DE (CLIs `twincell-pseudobulk`, `twincell-diffexpr`) |
| `deeplife_toolkit.notebooks` | Bundled Jupyter tutorials (PyPI wheel); see `packaged_notebooks_directory()` |

Install with `pip install deeplife-toolkit` (the PyPI project name uses a hyphen). Import `deeplife_toolkit.twincell`, `deeplife_toolkit.pseudobulk`, and `deeplife_toolkit.differential_expression` (nested packages inside the `deeplife_toolkit` distribution, which uses an underscore).
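A quick way to confirm the nested packages from the table above resolve after installation, without triggering heavy imports, is a generic `importlib` check (stdlib only; nothing here is specific to the toolkit):

```python
import importlib.util


def check_modules(names):
    """Return {module_name: bool}, tolerating missing parent packages."""
    present = {}
    for name in names:
        try:
            present[name] = importlib.util.find_spec(name) is not None
        except ModuleNotFoundError:
            # find_spec raises if a parent package of a dotted name is absent.
            present[name] = False
    return present


status = check_modules([
    "deeplife_toolkit.twincell",
    "deeplife_toolkit.pseudobulk",
    "deeplife_toolkit.differential_expression",
])
for name, ok in status.items():
    print(f"{name}: {'OK' if ok else 'MISSING'}")
```

All three should report OK in an environment where `pip install deeplife-toolkit` succeeded.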


Minimal API usage

End-to-end flow: preprocess/validate the .h5ad → create_prediction → poll with wait_for_prediction (or watch_prediction) → read results (and optional influence / causal helpers). The toolkit validates uploads the same way the TwinCell API does.

import os

from deeplife_toolkit.twincell.api import OpenDeepLifeClient

# Reads DEEPLIFE_API_KEY from the environment; pass base_url= for other hosts.
client = OpenDeepLifeClient(api_key=os.environ["DEEPLIFE_API_KEY"])
# Upload a preprocessed, TwinCell-ready AnnData file and start a prediction.
prediction = client.create_prediction(dataset="twincell_ready.h5ad")
# Block until the prediction reaches a terminal state.
final = client.wait_for_prediction(prediction_id=prediction.prediction_id)
print(final.status)

Defaults: the client uses the toolkit’s configured API base URL; pass base_url= for another environment. You must be able to reach that host from your network (see Network under Install). Retries apply to safe GET-style calls (polling), not duplicate uploads on POST. For TLS/proxy issues, use tls_verify= and trust_env= on the client—see OpenDeepLifeClient in deeplife_toolkit.twincell.api.
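The retry note above comes down to idempotency: re-issuing a GET-style status poll is safe, while re-sending an upload POST could duplicate work. A generic polling loop looks roughly like this (an illustrative sketch, not the client's actual code; all names here are invented):

```python
import time


def poll_until_done(fetch_status, *, interval=2.0, timeout=600.0,
                    terminal=("succeeded", "failed", "cancelled")):
    """Poll fetch_status() until it returns a terminal state or timeout expires.

    Polling a status endpoint is idempotent, so repeating the call on
    transient errors is safe -- unlike re-sending an upload POST.
    """
    deadline = time.monotonic() + timeout
    while True:
        status = fetch_status()
        if status in terminal:
            return status
        if time.monotonic() >= deadline:
            raise TimeoutError(f"prediction still {status!r} after {timeout}s")
        time.sleep(interval)
```

Real clients typically add jitter and exponential backoff on top of the fixed interval shown here.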

For preprocessing options (raw layer, disease filter, gene symbol column, etc.), use deeplife_toolkit.twincell.validation or the example notebooks.


Example notebooks

Tutorial .ipynb files are shipped inside the PyPI wheel under deeplife_toolkit/notebooks/ (no git clone required to obtain them). The Kang PBMC walkthrough is kang_pbmc.ipynb: preprocess → pseudo-bulk → PyDESeq2 → TwinCellStudy, scorecards, causal subgraphs, and simulate(), with an optional commented block for the in-memory skin_atlas_2024 atlas.

Repository layout: the same file is edited in the repo at src/deeplife_toolkit/notebooks/kang_pbmc.ipynb. The top-level notebooks/ folder holds a short README and is reserved for generated data when you work from a clone.

Run the Kang tutorial (PyPI only)

  1. Create and activate a virtual environment, then:

    pip install "deeplife-toolkit[notebook]" jupyterlab seaborn
    
  2. (Optional) Register this environment as a Jupyter kernel so it appears by name in the UI:

    python -m ipykernel.kernelspec --user --name deeplife-toolkit --display-name "Python (deeplife-toolkit)"
    
  3. Start JupyterLab using the bundled notebook directory as the server root (macOS / Linux):

    jupyter lab --notebook-dir="$(python -c "from deeplife_toolkit.notebooks import packaged_notebooks_directory; print(packaged_notebooks_directory())")"
    

    Windows (PowerShell):

    $nb = python -c "from deeplife_toolkit.notebooks import packaged_notebooks_directory; print(packaged_notebooks_directory())"
    jupyter lab --notebook-dir="$nb"
    
  4. In JupyterLab, open kang_pbmc.ipynb. Choose the kernel from step 2 (or the interpreter where you ran pip install).

The bundled directory usually lives under site-packages and may be read-only on some systems. If saving notebook outputs fails, copy kang_pbmc.ipynb to a writable folder, run jupyter lab from there, and open the copy.
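Copying a notebook out of a read-only install tree is an ordinary file copy. A small stdlib helper can script it (a generic sketch; in practice you would pass `packaged_notebooks_directory()` as `src_dir`):

```python
import pathlib
import shutil


def copy_notebook(src_dir, name, dest_dir):
    """Copy one .ipynb from a (possibly read-only) source tree to a writable folder."""
    src = pathlib.Path(src_dir) / name
    dest = pathlib.Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    # copy2 preserves timestamps; the copy is writable in the destination.
    return pathlib.Path(shutil.copy2(src, dest / name))
```

Then launch `jupyter lab` from the destination folder and open the copy there.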

The Kang dataset is fetched over HTTPS inside the notebook (get_adata_from_url). Part 2 (TwinCell HTTP) needs DEEPLIFE_API_KEY or a one-time getpass prompt, plus network access to the API (for example Tailscale as your org documents).

| Location | Role |
| --- | --- |
| `deeplife_toolkit.notebooks.packaged_notebooks_directory()` | Path to bundled .ipynb files after `pip install`. |
| `notebooks/` in a git clone | README + local download/output paths (gitignored data). |

Contributors working from a git clone can use uv sync --group dev --extra notebook instead of pip to install the locked dependency set.

Example dataset helpers live in deeplife_toolkit.twincell.api.datasets (EXAMPLE_DATASETS is empty until you add HTTP(S) links; download_example_dataset streams those URLs). In-memory demo atlas: get_adata(dataset_name="skin_atlas_2024") from deeplife_toolkit.twincell.api.data.
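"Streams those URLs" means a chunked download that never buffers the whole file in memory. A generic stdlib sketch of that pattern (the toolkit's `download_example_dataset` may differ in details):

```python
import pathlib
import shutil
import urllib.request


def stream_download(url, dest, chunk_size=1 << 20):
    """Download url to dest in chunks (default 1 MiB) instead of reading it all at once."""
    dest = pathlib.Path(dest)
    with urllib.request.urlopen(url) as response, dest.open("wb") as out:
        # copyfileobj reads/writes chunk_size bytes at a time.
        shutil.copyfileobj(response, out, length=chunk_size)
    return dest
```

This keeps memory usage flat even for multi-gigabyte .h5ad archives.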

After pip install -U deeplife-toolkit, confirm imports match the layout you expect; the latest release is always on PyPI.


Development

uv sync --group dev
make check-all    # or: ruff, mypy, pytest — see Makefile

CI: .github/workflows/ci.yml runs on pushes and PRs to main / master: uv sync --frozen --group dev, Ruff, mypy, pytest, uv build, twine check --strict. .github/dependabot.yml bumps GitHub Actions weekly.

Releases to PyPI: .github/workflows/pypi-publish.yml runs the same checks, then publishes with OIDC trusted publishing (GitHub environment pypi, trusted publisher configured on the deeplife-toolkit PyPI project). Bump version in pyproject.toml, push to main, then either push a tag matching v* or run the workflow manually from the Actions tab.

Project details


Download files

Download the file for your platform.

Source Distribution

deeplife_toolkit-0.1.4.tar.gz (518.1 kB)

Uploaded Source

Built Distribution


deeplife_toolkit-0.1.4-py3-none-any.whl (375.1 kB)

Uploaded Python 3

File details

Details for the file deeplife_toolkit-0.1.4.tar.gz.

File metadata

  • Download URL: deeplife_toolkit-0.1.4.tar.gz
  • Size: 518.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for deeplife_toolkit-0.1.4.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 51198794891f47553ea12b41b794d1623ee68f1bc8b3f9f70dc9f90fb5b0b59c |
| MD5 | fdb81584ad0a36d72957045b1fd8a58e |
| BLAKE2b-256 | e3ce5a35329b89e63b3a5edbbfb844542518e1a612af6e4bd96840a28fab0440 |
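To verify a downloaded archive against the published SHA256 digest, hash it locally with the standard library (a generic helper, not part of the toolkit):

```python
import hashlib
import pathlib


def sha256_of(path, chunk_size=1 << 20):
    """Hex SHA256 of a file, read in chunks so large archives don't fill memory."""
    digest = hashlib.sha256()
    with pathlib.Path(path).open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

Compare `sha256_of("deeplife_toolkit-0.1.4.tar.gz")` against the SHA256 value in the table above.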


Provenance

The following attestation bundles were made for deeplife_toolkit-0.1.4.tar.gz:

Publisher: pypi-publish.yml on deeplifeai/deeplife-toolkit

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file deeplife_toolkit-0.1.4-py3-none-any.whl.

File metadata

File hashes

Hashes for deeplife_toolkit-0.1.4-py3-none-any.whl

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | e56c2d49f7acd074e29d9b1bee52cbf804b14ac7aa998c2cd771d3dc88664fb3 |
| MD5 | a874d61efcb37aecff6b96edb3e21101 |
| BLAKE2b-256 | 4a8538778fc3e5ee623bfc8f35f788584fe4e7d8cdfaacf2e0d6b753e0771fd3 |


Provenance

The following attestation bundles were made for deeplife_toolkit-0.1.4-py3-none-any.whl:

Publisher: pypi-publish.yml on deeplifeai/deeplife-toolkit

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
