An automated deep learning pipeline for segmentation of the scapula, humerus, and their respective subregions in CT scans.

Project description

armcortnet

Armcortnet provides automatic segmentation of the humerus and scapula from CT scans. The deep learning model is also trained to segment the cortical and trabecular subregions of each bone.

The deep learning pipeline first uses armcrop to crop to an oriented bounding box around each humerus or scapula in the image; a neural network trained with the nnUNet framework then segments the cropped volume. The segmentation is transformed back to the original coordinate system, post-processed, and finally saved as a .seg.nrrd file.

Installation

Please install PyTorch before installing armcortnet. You can learn about installing PyTorch from the official website (pytorch.org).

Then install armcortnet using pip:

pip install armcortnet

For faster oriented-bounding-box cropping, you can replace onnxruntime with onnxruntime-gpu.
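A minimal sketch of that swap, assuming a CUDA-capable GPU and that the CPU build of onnxruntime was pulled in as a dependency:

```shell
# Remove the CPU ONNX runtime and install the GPU build in its place.
pip uninstall -y onnxruntime
pip install onnxruntime-gpu
```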

Usage

To generate a segmentation of the humerus or scapula from a CT volume, use the following:

import armcortnet
import SimpleITK as sitk

# initialize the segmentation model
model = armcortnet.Net(bone_type="scapula")  # or "humerus"

# perform segmentation prediction on a CT volume
pred_segmentations = model.predict(
    vol_path="path/to/input/ct.nrrd"
)
# output is a list of SimpleITK images, one for each bone_type detected in the CT
for i, pred_seg in enumerate(pred_segmentations):
    # write each of the segmentations to the disk
    sitk.WriteImage(pred_seg, f"scapula-{i}.seg.nrrd")

A mesh of the predicted bone can be generated using the following:

# perform mesh prediction on a CT volume, returns list of vtkPolyData objects
pred_meshes = model.predict_poly(
    vol_path="path/to/input/ct.nrrd"
)

# iterate over each detected object
for i, cort_trab_polys in enumerate(pred_meshes):
    # iterate over the cortical and trabecular meshes
    for j, poly in enumerate(cort_trab_polys):
        armcortnet.write_polydata(poly, f"scapula_{i}_{j}.ply")

Output Labels

The segmentation output contains the following labels:

  • 0: Background
  • 1: Other adjacent bones (e.g., clavicle, radius, ulna)
  • 2: Cortical region of bone of interest
  • 3: Trabecular region of bone of interest

Note: label 1 is removed when post-processing is used.

Models

Trained models are automatically downloaded from HuggingFace Hub (gregspangenberg/armcortnet) on first use.
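If you need the weights ahead of time (for example, on a machine without internet access at run time), they can be fetched with the huggingface_hub client. The repo id comes from the line above; the `prefetch_armcortnet_weights` helper and the use of `snapshot_download` are this note's suggestion, not part of the armcortnet API.

```python
from huggingface_hub import snapshot_download


def prefetch_armcortnet_weights() -> str:
    # Download the model repository (or reuse the local cache)
    # and return the path to the local directory.
    return snapshot_download(repo_id="gregspangenberg/armcortnet")
```

Call `prefetch_armcortnet_weights()` once while online; subsequent uses of armcortnet on the same machine should then hit the local HuggingFace cache.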
