
Optimized slide tiling library for histopathology


hs2p

PyPI version · Python 3.10+ · HuggingFace Space

hs2p is a Python package for fast, scalable whole-slide tiling. You can request tiles at any spacing, whether or not that spacing is natively present in the image pyramid. It is designed for computational pathology workflows that need reproducible coordinates.
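To make "any spacing, whether or not natively present" concrete, here is an illustrative sketch (not hs2p internals) of how a requested spacing can be resolved against a slide pyramid: pick the finest level whose native spacing does not exceed the target, then rescale tiles read at that level.

```python
# Illustrative sketch (NOT hs2p internals): resolving a requested spacing
# against a slide pyramid's native level spacings.

def resolve_spacing(level_spacings_um, target_um):
    """Return (level_index, resize_factor) for a requested spacing.

    level_spacings_um is ordered finest -> coarsest, e.g. [0.25, 0.5, 1.0, 2.0].
    """
    best = 0
    for i, spacing in enumerate(level_spacings_um):
        if spacing <= target_um:
            best = i  # finest level we can downscale from without upsampling
    native = level_spacings_um[best]
    return best, target_um / native  # read at `best`, scale by this factor

# Spacing natively present: read level 1 as-is.
print(resolve_spacing([0.25, 0.5, 1.0, 2.0], target_um=0.5))  # (1, 1.0)
# Spacing absent from the pyramid: read level 0 and downscale by 2x.
print(resolve_spacing([0.25, 1.0], target_um=0.5))  # (0, 2.0)
```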

We support two main workflows:

  • a Python API for library-style integration
  • a CLI for batch preprocessing

Demo

Try hs2p interactively: hs2p-demo on HuggingFace Spaces
You can adjust tiling parameters (spacing, tile size, tissue threshold, overlap) and instantly see a tiling preview and tissue mask overlay.
You can also upload your own pyramidal WSI (up to 1 GB).

Installation

pip install hs2p

Workflows

Tiling

Tiling computes a reproducible grid of tile coordinates for each slide and saves them as named artifacts with extraction metadata, ready for downstream use.
When a precomputed tissue mask is not provided, hs2p segments tissue on the fly. If you want to precompute tissue masks, a standalone script is available.
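The "reproducible grid" idea can be sketched in a few lines (this is not hs2p's implementation; the stride rule, tile size derived from overlap, is an assumption for illustration): the same slide dimensions and config always yield the same coordinates.

```python
# Illustrative sketch (NOT hs2p's implementation): a deterministic tile grid.
# The stride is derived from tile size and fractional overlap, so identical
# inputs always yield identical coordinates.

def tile_grid(width, height, tile_size, overlap=0.0):
    stride = max(1, int(round(tile_size * (1.0 - overlap))))
    return [
        (x, y)
        for x in range(0, width - tile_size + 1, stride)
        for y in range(0, height - tile_size + 1, stride)
    ]

coords = tile_grid(width=1000, height=600, tile_size=224, overlap=0.0)
print(len(coords))  # 4 columns x 2 rows = 8 tiles
```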

HS2P tiling workflow

Sampling

Sampling filters or partitions tile coordinates by annotation coverage so you can keep only tiles relevant to a tissue class or label.

HS2P sampling workflow

Python API

hs2p supports pre-extracted tissue masks. If you don't have tissue masks, you can either:

  • use our standalone tissue segmentation script (recommended)
  • tune the SegmentationConfig parameters and let hs2p segment tissue on the fly
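For intuition on what on-the-fly segmentation does, a common baseline (NOT necessarily hs2p's actual algorithm) thresholds pixel saturation: stained tissue is colorful while background glass is near-grayscale.

```python
# Conceptual sketch of tissue segmentation by saturation thresholding.
# This illustrates the idea only; hs2p's SegmentationConfig may use a
# different method and parameters.

def tissue_mask(rgb_pixels, sat_threshold=0.15):
    """rgb_pixels: list of (r, g, b) tuples in 0..255. Returns list of bools."""
    mask = []
    for r, g, b in rgb_pixels:
        mx, mn = max(r, g, b), min(r, g, b)
        saturation = 0.0 if mx == 0 else (mx - mn) / mx
        mask.append(saturation > sat_threshold)
    return mask

pixels = [(230, 230, 230), (180, 80, 140)]  # background glass, stained tissue
print(tissue_mask(pixels))  # [False, True]
```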

Minimal tiling example:

from pathlib import Path

from hs2p import (
    SlideSpec,
    TilingConfig,
    overlay_mask_on_slide,
    save_tiling_result,
    tile_slide,
    write_tiling_preview,
)

result = tile_slide(
    SlideSpec(
        sample_id="slide-1",
        image_path=Path("/data/wsi/slide-1.tif"),
        mask_path=Path("/data/mask/slide-1.tif"),
    ),
    tiling=TilingConfig(
        backend="openslide",
        target_spacing_um=0.5,
        target_tile_size_px=224,
        tolerance=0.07,
        overlap=0.0,
        tissue_threshold=0.1,
    ),
)

artifacts = save_tiling_result(result, output_dir=Path("output"))

print(artifacts.tiles_npz_path)   # output/coordinates/slide-1.tiles.npz ; more info in docs/artifacts.md
print(artifacts.tiles_meta_path)  # output/coordinates/slide-1.tiles.meta.json ; more info in docs/artifacts.md

tiling_preview_path = write_tiling_preview(
    result=result,
    output_dir=Path("output"),
    downsample=32,
)
print(tiling_preview_path)  # output/visualization/tiling/slide-1.jpg ; low resolution preview of tiling result, good for QC

mask_overlay = overlay_mask_on_slide(
    wsi_path=result.image_path,
    annotation_mask_path=Path("/data/mask/slide-1.tif"),
    downsample=32,
    backend=result.backend,
)
mask_overlay.save("output/visualization/mask/slide-1.jpg")

result is a TilingResult for one slide. It gives downstream pipelines the tile coordinates plus the metadata needed to relate those coordinates back to the slide pyramid and persist them as reusable named artifacts.

More API details: docs/api.md

CLI

The CLI is intended for fast batch processing of multiple slides with the same config. Both CLI entrypoints expect the same input CSV schema:

sample_id,image_path,mask_path
slide-1,/data/wsi/slide-1.tif,/data/mask/slide-1.tif
slide-2,/data/wsi/slide-2.tif,
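Note that mask_path may be left empty (as for slide-2 above), in which case tissue is segmented on the fly. A quick sketch of reading this schema with the standard library:

```python
# Sketch of parsing the input CSV schema shown above; an empty mask_path
# means no precomputed tissue mask for that slide.
import csv
import io

CSV_TEXT = """sample_id,image_path,mask_path
slide-1,/data/wsi/slide-1.tif,/data/mask/slide-1.tif
slide-2,/data/wsi/slide-2.tif,
"""

rows = list(csv.DictReader(io.StringIO(CSV_TEXT)))
for row in rows:
    mask = row["mask_path"] or None  # empty string -> no precomputed mask
    print(row["sample_id"], mask)
# slide-1 /data/mask/slide-1.tif
# slide-2 None
```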

For a first run, start from hs2p/configs/default.yaml and edit only the essentials:

  • csv
  • output_dir
  • tiling.backend
  • tiling.params.target_spacing_um
  • tiling.params.target_tile_size_px

Run tiling:

python -m hs2p.tiling --config-file /path/to/config.yaml

Run sampling:

python -m hs2p.sampling --config-file /path/to/config.yaml

For sampling, add tiling.sampling_params.pixel_mapping and tiling.sampling_params.tissue_percentage for the annotations you want to keep.
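The idea behind coverage-based sampling can be sketched as follows (the label values and threshold here are hypothetical, mirroring the roles of pixel_mapping and tissue_percentage): keep a tile only if enough of its annotation-mask pixels carry the target label.

```python
# Illustrative sketch of coverage-based tile filtering. The label value and
# minimum fraction are hypothetical stand-ins for pixel_mapping /
# tissue_percentage entries.

def keep_tile(mask_patch, label_value, min_fraction):
    """mask_patch: 2D list of ints (annotation pixel values under one tile)."""
    flat = [v for row in mask_patch for v in row]
    coverage = sum(v == label_value for v in flat) / len(flat)
    return coverage >= min_fraction

patch = [
    [1, 1, 0, 0],
    [1, 1, 0, 0],
]  # label 1 covers 50% of this tile's mask pixels
print(keep_tile(patch, label_value=1, min_fraction=0.5))  # True
print(keep_tile(patch, label_value=1, min_fraction=0.6))  # False
```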

More CLI details: docs/cli.md

Outputs

hs2p writes explicit named artifacts rather than anonymous coordinate dumps.

  • Tiling writes coordinates/{sample_id}.tiles.npz and coordinates/{sample_id}.tiles.meta.json
  • Sampling writes the same pair under coordinates/<annotation>/
  • Batch runs also write process_list.csv
  • Saved coordinate arrays use a deterministic column-major order: numeric x first, then numeric y within each shared x
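The deterministic ordering in the last bullet can be sketched in one line: sort by x, then by y within each shared x, so coordinate arrays come out identical across runs.

```python
# Sketch of the stated column-major ordering: x first, then y within each
# shared x, giving reproducible coordinate arrays.

coords = [(224, 0), (0, 224), (0, 0), (224, 224)]
ordered = sorted(coords, key=lambda xy: (xy[0], xy[1]))
print(ordered)  # [(0, 0), (0, 224), (224, 0), (224, 224)]
```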

Artifact field reference: docs/artifacts.md

Docker


If you prefer running hs2p in a container, a published Docker image is available:

docker pull waticlems/hs2p:latest
docker run --rm -it -v /path/to/your/data:/data waticlems/hs2p:latest
