
Optimized slide tiling library for histopathology


hs2p

PyPI version · Python 3.10+ · HuggingFace Space

hs2p is a Python package for efficient slide tiling and tile sampling at any requested spacing, whether or not that spacing is natively present in the whole-slide image. It is designed for computational pathology workflows that need reproducible coordinates.

We support two main workflows:

  • a Python API for library-style integration
  • a CLI for batch preprocessing

Demo

Try hs2p interactively: hs2p-demo on HuggingFace Spaces
You can adjust tiling parameters (spacing, tile size, tissue threshold, overlap) and instantly see a tiling preview and tissue mask overlay.
You can also upload your own pyramidal WSI (up to 1 GB).

Installation

pip install hs2p

Workflows

Tiling

Tiling computes a reproducible grid of tile coordinates for each slide and saves them as named artifacts with extraction metadata, ready for downstream use.
When a precomputed tissue mask is not provided, hs2p segments tissue on the fly. If you want to precompute tissue masks, a standalone script is available.

[Diagram: HS2P tiling workflow]

Sampling

Sampling filters or partitions tile coordinates by annotation coverage so you can keep only tiles relevant to a tissue class or label.
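The coverage filter at the heart of sampling can be sketched independently of hs2p's API. The function below is illustrative only (it is not an hs2p function): given tile coordinates, a label mask, and a minimum coverage fraction, it keeps tiles whose annotation coverage meets the threshold.

```python
import numpy as np

def filter_by_coverage(coords, tile_size, mask, label, min_fraction):
    """Keep tiles whose coverage of `label` is at least `min_fraction`.

    coords: (N, 2) array of top-left (x, y) tile positions in mask pixels.
    mask:   2-D integer label mask aligned with the coordinate space.
    """
    kept = []
    for x, y in coords:
        patch = mask[y : y + tile_size, x : x + tile_size]
        if (patch == label).mean() >= min_fraction:
            kept.append((x, y))
    return np.array(kept).reshape(-1, 2)

# Toy example: a 4x4 mask where label 1 covers the left half.
mask = np.zeros((4, 4), dtype=np.uint8)
mask[:, :2] = 1
coords = np.array([[0, 0], [2, 0]])
print(filter_by_coverage(coords, 2, mask, label=1, min_fraction=0.5))
# [[0 0]]
```

hs2p's actual sampling additionally partitions tiles per annotation class; see the CLI section for the relevant config keys.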

[Diagram: HS2P sampling workflow]

Python API

hs2p supports pre-extracted tissue masks. If you don't have them, you can either:

  • use the standalone tissue segmentation script (recommended)
  • tune the SegmentationConfig parameters and let hs2p segment tissue on the fly

Minimal tiling example:

from pathlib import Path

from hs2p import (
    SlideSpec,
    TilingConfig,
    overlay_mask_on_slide,
    save_tiling_result,
    tile_slide,
    write_tiling_preview,
)

result = tile_slide(
    SlideSpec(
        sample_id="slide-1",
        image_path=Path("/data/wsi/slide-1.tif"),
        mask_path=Path("/data/mask/slide-1.tif"),
    ),
    tiling=TilingConfig(
        backend="openslide",
        target_spacing_um=0.5,
        target_tile_size_px=224,
        tolerance=0.07,
        overlap=0.0,
        tissue_threshold=0.1,
    ),
)

artifacts = save_tiling_result(result, output_dir=Path("output"))

print(artifacts.tiles_npz_path)   # output/coordinates/slide-1.tiles.npz ; more info in docs/artifacts.md
print(artifacts.tiles_meta_path)  # output/coordinates/slide-1.tiles.meta.json ; more info in docs/artifacts.md

tiling_preview_path = write_tiling_preview(
    result=result,
    output_dir=Path("output"),
    downsample=32,
)
print(tiling_preview_path)  # output/visualization/tiling/slide-1.jpg ; low resolution preview of tiling result, good for QC

mask_overlay = overlay_mask_on_slide(
    wsi_path=result.image_path,
    annotation_mask_path=Path("/data/mask/slide-1.tif"),
    downsample=32,
    backend=result.backend,
)
mask_overlay.save("output/visualization/mask/slide-1.jpg")

result is a TilingResult for one slide. It gives downstream pipelines the tile coordinates plus the metadata needed to relate those coordinates back to the slide pyramid and persist them as reusable named artifacts.
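Downstream code can reload the saved pair with plain NumPy and json. The snippet below simulates the two artifacts first so it is self-contained; the `coords` array key and the `target_spacing_um` metadata field are assumptions for illustration — consult docs/artifacts.md for the real schema.

```python
import json
import tempfile
from pathlib import Path

import numpy as np

out = Path(tempfile.mkdtemp())

# Simulate the tiling artifacts (field names are assumptions;
# see docs/artifacts.md for the actual schema).
np.savez(out / "slide-1.tiles.npz", coords=np.array([[0, 0], [224, 0]]))
(out / "slide-1.tiles.meta.json").write_text(json.dumps({"target_spacing_um": 0.5}))

# Downstream reload:
tiles = np.load(out / "slide-1.tiles.npz")
meta = json.loads((out / "slide-1.tiles.meta.json").read_text())
print(tiles["coords"].shape, meta["target_spacing_um"])  # (2, 2) 0.5
```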

More API details: docs/api.md

CLI

The CLI is intended for fast batch processing of multiple slides with the same config. Both CLI entrypoints expect the same input CSV schema:

sample_id,image_path,mask_path
slide-1,/data/wsi/slide-1.tif,/data/mask/slide-1.tif
slide-2,/data/wsi/slide-2.tif,
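The CSV can be generated with the standard library; note that mask_path may be left empty (as for slide-2 above), in which case hs2p segments tissue on the fly.

```python
import csv
from pathlib import Path

rows = [
    {"sample_id": "slide-1",
     "image_path": "/data/wsi/slide-1.tif",
     "mask_path": "/data/mask/slide-1.tif"},
    # Empty mask_path: hs2p will segment tissue on the fly for this slide.
    {"sample_id": "slide-2",
     "image_path": "/data/wsi/slide-2.tif",
     "mask_path": ""},
]

with open("slides.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["sample_id", "image_path", "mask_path"])
    writer.writeheader()
    writer.writerows(rows)

print(Path("slides.csv").read_text())
```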

For a first run, start from hs2p/configs/default.yaml and edit only the essentials:

  • csv
  • output_dir
  • tiling.backend
  • tiling.params.target_spacing_um
  • tiling.params.target_tile_size_px

Run tiling:

python -m hs2p.tiling --config-file /path/to/config.yaml

Run sampling:

python -m hs2p.sampling --config-file /path/to/config.yaml

For sampling, add tiling.sampling_params.pixel_mapping and tiling.sampling_params.tissue_percentage for the annotations you want to keep.
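As a hedged sketch, those two keys might be set as below; the annotation name and the mapping of label names to mask pixel values are illustrative assumptions — see docs/cli.md for the actual format.

```yaml
tiling:
  sampling_params:
    pixel_mapping:
      tumor: 1          # annotation label -> mask pixel value (illustrative)
    tissue_percentage: 0.5
```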

More CLI details: docs/cli.md

Outputs

hs2p writes explicit named artifacts rather than anonymous coordinate dumps.

  • Tiling writes coordinates/{sample_id}.tiles.npz and coordinates/{sample_id}.tiles.meta.json
  • Sampling writes the same pair under coordinates/<annotation>/
  • Batch runs also write process_list.csv
  • Saved coordinate arrays use a deterministic column-major order: numeric x first, then numeric y within each shared x
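That ordering can be reproduced with NumPy's lexsort (which treats its last key as primary), shown here as a standalone sketch:

```python
import numpy as np

coords = np.array([[224, 224], [0, 224], [224, 0], [0, 0]])

# Primary key x, secondary key y: np.lexsort sorts by its last key first.
order = np.lexsort((coords[:, 1], coords[:, 0]))
print(coords[order])
# [[  0   0]
#  [  0 224]
#  [224   0]
#  [224 224]]
```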

Artifact field reference: docs/artifacts.md

Docker


If you prefer running hs2p in a container, a published Docker image is available:

docker pull waticlems/hs2p:latest
docker run --rm -it -v /path/to/your/data:/data waticlems/hs2p:latest
