
Official DeepLife Python toolkit on PyPI: TwinCell (`deeplife_toolkit.twincell`), pseudo-bulk (`deeplife_toolkit.pseudobulk`), differential expression (`deeplife_toolkit.differential_expression`). Optional `[notebook]` extra for Jupyter tutorials.


deeplife-toolkit

PyPI version · CI · Python 3.12+ · License: MIT

Official DeepLife Python toolkit for TwinCell and related analysis: call the Open API from Python, validate and preprocess AnnData (.h5ad) locally, aggregate to pseudo-bulk with deeplife_toolkit.pseudobulk, and run sample-level differential expression with deeplife_toolkit.differential_expression (suitable for pseudo-bulk, bulk, or other compatible count tables).

Source: github.com/deeplifeai/deeplife-toolkit · Documentation: docs index on GitHub · Python: 3.12+ (see pyproject.toml).


Install

pip install deeplife-toolkit

API key: you need a DeepLife key (usually dl_…). Create or copy one from the TwinCell console (sign in, then open API keys / Keys). In code, set DEEPLIFE_API_KEY in the environment or pass api_key= when constructing the client. API documentation for your deployment is usually at {base_url}/docs when enabled (the client defaults to the production Open API host; pass base_url= for another environment).

Network: the TwinCell Open API is not on the public Internet today. It runs in DeepLife’s cluster and is reachable only when your machine has access to that network—for example by being signed into your organization’s Tailscale tailnet (or whatever access path your team documents). Local-only features (twincell-validate-h5ad, pseudo-bulk, differential expression on .h5ad) do not require API connectivity.

From a git clone (contributors):

uv sync --group dev

Documentation

Long-form guides (same content as the Documentation link on PyPI):

| Guide | Description |
| --- | --- |
| Installation | pip install, optional `[notebook]`, API keys, verify imports |
| Pseudo-bulk & differential expression | AnnData → pseudo-bulk, sample-level DE, CLIs |
| TwinCell Open API | HTTP client, predictions, studies, datasets |

Index: docs/README.md.


Package layout

| Import | Role |
| --- | --- |
| `deeplife_toolkit.twincell.api` | HTTP client (`OpenDeepLifeClient`, async variant), predictions, datasets, TwinCell / TwinCellStudy workflows, data catalog helpers, plotting |
| `deeplife_toolkit.twincell.validation` | TwinCell-ready preprocessing and local validation (`DeepLifePreprocessor`, configs, CLI `twincell-validate-h5ad`) |
| `deeplife_toolkit.pseudobulk`, `deeplife_toolkit.differential_expression` | Pseudo-bulk from single-cell AnnData, and sample-level DE (CLIs `twincell-pseudobulk`, `twincell-diffexpr`) |

Install with pip install deeplife-toolkit (PyPI project name, hyphen). Import deeplife_toolkit.twincell, deeplife_toolkit.pseudobulk, and deeplife_toolkit.differential_expression (nested packages inside the deeplife_toolkit distribution).
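For intuition, pseudo-bulk aggregation collapses single-cell counts into one profile per sample. A generic pandas sketch of that idea (illustrative only — the toolkit's own implementation works on AnnData and may differ):

```python
import pandas as pd

# Toy single-cell count matrix: rows are cells, columns are genes,
# plus a sample label for each cell.
counts = pd.DataFrame(
    {"GENE_A": [3, 1, 0, 2], "GENE_B": [0, 5, 2, 1]},
    index=["cell1", "cell2", "cell3", "cell4"],
)
sample_of_cell = pd.Series(
    ["s1", "s1", "s2", "s2"], index=counts.index, name="sample"
)

# Pseudo-bulk: sum raw counts over all cells from the same sample,
# yielding one row per sample suitable for bulk-style DE tools.
pseudobulk = counts.groupby(sample_of_cell).sum()
print(pseudobulk)
```

The resulting per-sample table is what sample-level differential expression then operates on.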


Minimal API usage

End-to-end flow: preprocess/validate .h5ad → create_prediction → poll or watch_prediction → read results (and optional influence / causal helpers). The toolkit validates uploads the same way the TwinCell API does.

import os
from deeplife_toolkit.twincell.api import OpenDeepLifeClient

client = OpenDeepLifeClient(api_key=os.environ["DEEPLIFE_API_KEY"])
prediction = client.create_prediction(dataset="twincell_ready.h5ad")
final = client.wait_for_prediction(prediction_id=prediction.prediction_id)
print(final.status)

Defaults: the client uses the toolkit’s configured API base URL; pass base_url= for another environment. You must be able to reach that host from your network (see Network under Install). Retries apply to safe GET-style calls (polling), not duplicate uploads on POST. For TLS/proxy issues, use tls_verify= and trust_env= on the client—see OpenDeepLifeClient in deeplife_toolkit.twincell.api.
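The GET/POST distinction matters because retrying a read just re-fetches state, while retrying a POST could duplicate an upload. A minimal, generic retry helper for idempotent polling calls (a sketch, not the client's actual retry code):

```python
import time

def retry_idempotent(fn, attempts=3, delay=0.0):
    """Retry a side-effect-free callable; safe only for GET-style reads."""
    last_exc = None
    for _ in range(attempts):
        try:
            return fn()
        except ConnectionError as exc:  # transient network failure
            last_exc = exc
            time.sleep(delay)
    raise last_exc

# Simulated poll that fails twice before returning a terminal status.
calls = {"n": 0}
def poll_status():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "SUCCEEDED"

result = retry_idempotent(poll_status)
print(result)  # "SUCCEEDED" after two retried failures
```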

For preprocessing options (raw layer, disease filter, gene symbol column, etc.), use deeplife_toolkit.twincell.validation or the example notebooks.


Example notebooks

Tutorial notebooks live in this repository under notebooks/ and are not inside the PyPI wheel. Install deeplife-toolkit from PyPI, then open the .ipynb from a git clone of this repo or from a single notebook file you downloaded from GitHub.

kang_pbmc.ipynb (Kang PBMC IFN-β): preprocess → pseudo-bulk → PyDESeq2 → TwinCellStudy, scorecards, causal subgraphs, and simulate(), with an optional commented block for in-memory skin_atlas_2024.

Run the Kang tutorial (PyPI install)

  1. Create and activate a virtual environment (recommended), then upgrade pip.

  2. Install the library, Jupyter, the [notebook] extra, and seaborn (used in the notebook):

    pip install "deeplife-toolkit[notebook]" jupyterlab seaborn
    
  3. Get the notebook on disk, for example:

    git clone https://github.com/deeplifeai/deeplife-toolkit.git
    cd deeplife-toolkit
    
  4. Start Jupyter from the repository root so relative paths in the notebook resolve:

    jupyter lab
    

    Open notebooks/kang_pbmc.ipynb. In Kernel → Change kernel, pick the Python environment where you ran pip install (if it is missing from the list, run python -m ipykernel install --user --name deeplife-toolkit --display-name "Python (deeplife-toolkit)" inside that environment and choose that kernel).

The Kang dataset is fetched over HTTPS inside the notebook (get_adata_from_url). Part 2 (TwinCell HTTP) needs DEEPLIFE_API_KEY or a one-time getpass prompt.
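A common pattern for that key lookup — environment variable first, one-time prompt as a fallback (a sketch; the notebook's exact prompt may differ):

```python
import os
from getpass import getpass

def resolve_api_key() -> str:
    """Prefer DEEPLIFE_API_KEY from the environment; prompt once otherwise."""
    key = os.environ.get("DEEPLIFE_API_KEY")
    if not key:
        key = getpass("DeepLife API key (dl_...): ")
    return key
```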

| Path | Role |
| --- | --- |
| notebooks/ | Tutorial sources (kang_pbmc.ipynb). |
| notebooks/data/ | Downloaded .h5ad and outputs (gitignored in the repo; created locally when you run cells). |

Contributors working from a git clone can use uv sync --group dev --extra notebook instead of pip to install from the pinned lockfile.

Example dataset helpers live in deeplife_toolkit.twincell.api.datasets (EXAMPLE_DATASETS is empty until you add HTTP(S) links; download_example_dataset streams those URLs). In-memory demo atlas: get_adata(dataset_name="skin_atlas_2024") from deeplife_toolkit.twincell.api.data.

After pip install -U deeplife-toolkit, confirm imports match the layout you expect; the latest release is always on PyPI.


Development

uv sync --group dev
make check-all    # or: ruff, mypy, pytest — see Makefile

CI: .github/workflows/ci.yml runs on pushes and PRs to main / master: uv sync --frozen --group dev, Ruff, mypy, pytest, uv build, twine check --strict. .github/dependabot.yml bumps GitHub Actions weekly.

Releases to PyPI: .github/workflows/pypi-publish.yml runs the same checks, then publishes with OIDC trusted publishing (GitHub environment pypi, trusted publisher configured on the deeplife-toolkit PyPI project). Bump version in pyproject.toml, push to main, then either push a tag matching v* or run the workflow manually from the Actions tab.

Download files

Source distribution: deeplife_toolkit-0.1.2.tar.gz (516.0 kB)

Built distribution: deeplife_toolkit-0.1.2-py3-none-any.whl (196.1 kB)

File details

Details for the file deeplife_toolkit-0.1.2.tar.gz.

File metadata

  • Download URL: deeplife_toolkit-0.1.2.tar.gz
  • Size: 516.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for deeplife_toolkit-0.1.2.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | d60874f11dfdcbca808145211b4ea1394434dca3d1dd110e80616262006bddd4 |
| MD5 | d127e98a0c41f8c7c740cf601882a47e |
| BLAKE2b-256 | 455c8c70d2551d7d0bc7034cf3a3846f532e8befc9e21f545ee282fb8bac5504 |
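To check a downloaded artifact against a published digest, a standard-library sketch:

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream the file in chunks so large archives never load fully into memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare the result against the SHA256 shown on PyPI, e.g.:
# sha256_of("deeplife_toolkit-0.1.2.tar.gz") == "d60874f1..."
```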


Provenance

The following attestation bundles were made for deeplife_toolkit-0.1.2.tar.gz:

Publisher: pypi-publish.yml on deeplifeai/deeplife-toolkit

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file deeplife_toolkit-0.1.2-py3-none-any.whl.

File hashes

Hashes for deeplife_toolkit-0.1.2-py3-none-any.whl

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 41c0e1492a31ef9b38e638b291264a99ad1272f98415bc78e174a810ff43387a |
| MD5 | fbaba82dcc462a032e976b5ca88c930f |
| BLAKE2b-256 | 54b69bf3d75e09a7150acf57f3c2e67eab278a195eb1d0e21187c9c1264cabf4 |


Provenance

The following attestation bundles were made for deeplife_toolkit-0.1.2-py3-none-any.whl:

Publisher: pypi-publish.yml on deeplifeai/deeplife-toolkit

