Official DeepLife Python toolkit on PyPI: TwinCell (`deeplife_toolkit.twincell`), pseudo-bulk (`deeplife_toolkit.pseudobulk`), differential expression (`deeplife_toolkit.differential_expression`). Optional `[notebook]` extra for Jupyter; bundled tutorials under `deeplife_toolkit.notebooks`.

Project description

deeplife-toolkit

PyPI version CI Python 3.12+ License: MIT

Official DeepLife Python toolkit for TwinCell and related analysis: call the Open API from Python, validate and preprocess AnnData (.h5ad) locally, aggregate to pseudo-bulk with deeplife_toolkit.pseudobulk, and run sample-level differential expression with deeplife_toolkit.differential_expression (suitable for pseudo-bulk, bulk, or other compatible count tables).

Source: github.com/deeplifeai/deeplife-toolkit · Documentation: docs index on GitHub · Python: 3.12+ (see pyproject.toml).


Install

pip install deeplife-toolkit

API key: you need a DeepLife key (usually dl_…). Create or copy one from the TwinCell console (sign in, then open API keys / Keys). In code, set DEEPLIFE_API_KEY in the environment or pass api_key= when constructing the client. API documentation for your deployment is usually at {base_url}/docs when enabled (the client defaults to the production Open API host; pass base_url= for another environment).

Network: the TwinCell Open API is not exposed on the public Internet today. It runs in DeepLife’s cluster and is reachable only when your machine has access to that network, for example when signed into your organization’s Tailscale tailnet (or whatever access path your team documents). Local-only features (twincell-validate-h5ad, pseudo-bulk, differential expression on .h5ad) do not require API connectivity.

From a git clone (contributors):

uv sync --group dev

Documentation

Long-form guides (same content as the Documentation link on PyPI):

Guide Description
Installation pip install, optional [notebook], API keys, verify imports
Pseudo-bulk & differential expression AnnData → pseudo-bulk, sample-level DE, CLIs
TwinCell Open API HTTP client, predictions, studies, datasets

Index: docs/README.md.


Package layout

Import Role
deeplife_toolkit.twincell.api HTTP client (OpenDeepLifeClient, async variant), predictions, datasets, TwinCell / TwinCellStudy workflows, data catalog helpers, plotting
deeplife_toolkit.twincell.validation TwinCell-ready preprocessing and local validation (DeepLifePreprocessor, configs, CLI twincell-validate-h5ad)
deeplife_toolkit.pseudobulk, deeplife_toolkit.differential_expression Pseudo-bulk from single-cell AnnData, and sample-level DE (CLIs twincell-pseudobulk, twincell-diffexpr)
deeplife_toolkit.notebooks Bundled Jupyter tutorials (PyPI wheel); see packaged_notebooks_directory()

Install with pip install deeplife-toolkit (PyPI project name, hyphen). Import deeplife_toolkit.twincell, deeplife_toolkit.pseudobulk, and deeplife_toolkit.differential_expression (nested packages inside the deeplife_toolkit distribution).
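Conceptually, pseudo-bulk aggregation sums per-cell gene counts within each sample. A toy, stdlib-only sketch of the idea (hypothetical helper; the real `deeplife_toolkit.pseudobulk` operates on AnnData objects and is more involved):

```python
from collections import defaultdict

def pseudobulk(cells):
    """Sum per-cell gene counts into one count vector per sample.

    cells: iterable of (sample_id, {gene: count}) pairs, one pair per cell.
    Returns {sample_id: {gene: summed_count}}.
    """
    totals: dict[str, defaultdict[str, int]] = {}
    for sample_id, counts in cells:
        bucket = totals.setdefault(sample_id, defaultdict(int))
        for gene, n in counts.items():
            bucket[gene] += n
    # plain dicts for a stable, easy-to-inspect return value
    return {sample: dict(counts) for sample, counts in totals.items()}
```

The resulting sample-level count table is what sample-level differential expression (e.g. the twincell-diffexpr CLI) expects as input.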


Minimal API usage

End-to-end flow: preprocess/validate .h5ad → create_prediction → poll or watch_prediction → read results (and optional influence / causal helpers). The toolkit validates uploads the same way the TwinCell API does.

import os
from deeplife_toolkit.twincell.api import OpenDeepLifeClient

client = OpenDeepLifeClient(api_key=os.environ["DEEPLIFE_API_KEY"])
prediction = client.create_prediction(dataset="twincell_ready.h5ad")
final = client.wait_for_prediction(prediction_id=prediction.prediction_id)
print(final.status)

Defaults: the client uses the toolkit’s configured API base URL; pass base_url= for another environment. You must be able to reach that host from your network (see Network under Install). Retries apply only to safe GET-style calls such as polling; POST uploads are not retried, to avoid duplicate submissions. For TLS/proxy issues, use tls_verify= and trust_env= on the client; see OpenDeepLifeClient in deeplife_toolkit.twincell.api.
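The poll-until-done step can be approximated client-side with a plain backoff loop; a minimal, hypothetical sketch (the real wait_for_prediction presumably does something similar, with its own timeouts and statuses):

```python
import time

def poll_until_done(fetch_status, timeout_s=600.0, base_delay_s=1.0, max_delay_s=30.0):
    """Poll a status-returning callable until it reports a terminal state.

    fetch_status: zero-argument callable returning a status string. It should
    be a safe GET-style call, so re-invoking it cannot duplicate an upload.
    """
    deadline = time.monotonic() + timeout_s
    delay = base_delay_s
    while time.monotonic() < deadline:
        status = fetch_status()
        if status in {"succeeded", "failed", "cancelled"}:
            return status
        time.sleep(delay)
        delay = min(delay * 2, max_delay_s)  # exponential backoff, capped
    raise TimeoutError("prediction did not reach a terminal state in time")
```
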

For preprocessing options (raw layer, disease filter, gene symbol column, etc.), use deeplife_toolkit.twincell.validation or the example notebooks.


Example notebooks

Tutorial .ipynb files are shipped inside the PyPI wheel under deeplife_toolkit/notebooks/ (no git clone required to obtain them). The Kang PBMC walkthrough is kang_pbmc.ipynb: preprocess → pseudo-bulk → PyDESeq2 → TwinCellStudy, overlap scorecards, causal subgraphs, and simulate(), with an optional commented block for the in-memory skin_atlas_2024.

Repository layout: the same file is edited in the repo at src/deeplife_toolkit/notebooks/kang_pbmc.ipynb. The top-level notebooks/ folder holds a short README and is reserved for generated data when you work from a clone.

Run the Kang tutorial (PyPI only)

  1. Create and activate a virtual environment, then:

    pip install "deeplife-toolkit[notebook]" jupyterlab seaborn
    
  2. (Optional) Register this environment as a Jupyter kernel so it appears by name in the UI:

    python -m ipykernel install --user --name deeplife-toolkit --display-name "Python (deeplife-toolkit)"
    
  3. Start JupyterLab using the bundled notebook directory as the server root (macOS / Linux):

    jupyter lab --notebook-dir="$(python -c "from deeplife_toolkit.notebooks import packaged_notebooks_directory; print(packaged_notebooks_directory())")"
    

    Windows (PowerShell):

    $nb = python -c "from deeplife_toolkit.notebooks import packaged_notebooks_directory; print(packaged_notebooks_directory())"
    jupyter lab --notebook-dir="$nb"
    
  4. In JupyterLab, open kang_pbmc.ipynb. Choose the kernel from step 2 (or the interpreter where you ran pip install).

The bundled directory usually lives under site-packages and may be read-only on some systems. If saving notebook outputs fails, copy kang_pbmc.ipynb to a writable folder, run jupyter lab from there, and open the copy.

The Kang dataset is fetched over HTTPS inside the notebook (get_adata_from_url). Part 2 (TwinCell HTTP) needs DEEPLIFE_API_KEY or a one-time getpass prompt, plus network access to the API (for example Tailscale as your org documents).

Location Role
deeplife_toolkit.notebooks.packaged_notebooks_directory() Path to bundled .ipynb files after pip install.
notebooks/ in a git clone README + local download/output paths (gitignored data).

Contributors working from a git clone can use uv sync --group dev --extra notebook instead of pip to get the locked dependency set.

Example dataset helpers live in deeplife_toolkit.twincell.api.datasets (EXAMPLE_DATASETS is empty until you add HTTP(S) links; download_example_dataset streams those URLs). In-memory demo atlas: get_adata(dataset_name="skin_atlas_2024") from deeplife_toolkit.twincell.api.data.
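download_example_dataset presumably streams each registered URL to disk rather than buffering it; a generic, hedged sketch of chunked streaming with the standard library (`stream_to_file` is illustrative, not the toolkit's API):

```python
import urllib.request
from pathlib import Path

def stream_to_file(url: str, dest: Path, chunk_size: int = 1 << 20) -> Path:
    """Stream a URL to dest in 1 MiB chunks so large .h5ad files never sit in memory."""
    with urllib.request.urlopen(url) as resp, open(dest, "wb") as out:
        while chunk := resp.read(chunk_size):
            out.write(chunk)
    return dest
```
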

After pip install -U deeplife-toolkit, confirm the imports above resolve as expected; the package layout can change between releases, and the latest release is always on PyPI.


Development

uv sync --group dev
make check-all    # or: ruff, mypy, pytest — see Makefile

CI: .github/workflows/ci.yml runs on pushes and PRs to main / master: uv sync --frozen --group dev, Ruff, mypy, pytest, uv build, twine check --strict. .github/dependabot.yml bumps GitHub Actions weekly.

Releases to PyPI: .github/workflows/pypi-publish.yml runs the same checks, then publishes with OIDC trusted publishing (GitHub environment pypi, trusted publisher configured on the deeplife-toolkit PyPI project). Bump version in pyproject.toml, push to main, then either push a tag matching v* or run the workflow manually from the Actions tab.

Project details


Download files

Download the file for your platform.

Source Distribution

deeplife_toolkit-0.1.7.tar.gz (379.5 kB)

Uploaded Source

Built Distribution


deeplife_toolkit-0.1.7-py3-none-any.whl (225.8 kB)

Uploaded Python 3

File details

Details for the file deeplife_toolkit-0.1.7.tar.gz.

File metadata

  • Download URL: deeplife_toolkit-0.1.7.tar.gz
  • Upload date:
  • Size: 379.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for deeplife_toolkit-0.1.7.tar.gz
Algorithm Hash digest
SHA256 e523bf613793c1d8272e9a9dff58566e96d35cc157a9e314fc778ea5c0dde985
MD5 d88eb6100c8487fbe47aa12140d57a1a
BLAKE2b-256 21326c6ec3387dc67bd79ff2539449c374f280de2daf5a11e7d9c42b88c8c0ae

See more details on using hashes here.
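Published digests like those above can be verified locally after downloading the artifact; a small hashlib sketch (`sha256_of` is an illustrative helper, not a toolkit function):

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hex SHA-256 of a file, hashed incrementally so large files stay out of memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

# Compare the result against the digest published on PyPI, e.g. for the sdist:
# sha256_of(Path("deeplife_toolkit-0.1.7.tar.gz")) == "e523bf61..."
```
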

Provenance

The following attestation bundles were made for deeplife_toolkit-0.1.7.tar.gz:

Publisher: pypi-publish.yml on deeplifeai/deeplife-toolkit

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file deeplife_toolkit-0.1.7-py3-none-any.whl.

File metadata

File hashes

Hashes for deeplife_toolkit-0.1.7-py3-none-any.whl
Algorithm Hash digest
SHA256 28c3955d31a75152b7464d941c9637b78ab3ef121de068a25164cf14797b5105
MD5 1c81e5e488574c0a39cc8d37ad6a7eb0
BLAKE2b-256 fa50c2306c69a95f423e0e090cc5afdf5b10b9e544086a666e200b706831efb7


Provenance

The following attestation bundles were made for deeplife_toolkit-0.1.7-py3-none-any.whl:

Publisher: pypi-publish.yml on deeplifeai/deeplife-toolkit

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.