An automated deep learning pipeline for segmentation of the scapula, humerus, and their respective subregions in CT scans.

armcortnet


Armcortnet provides automatic segmentation of the humerus and scapula from CT scans. The deep learning model also segments the cortical and trabecular subregions of each bone.

The deep learning pipeline uses armcrop to crop to an oriented bounding box around each humerus or scapula in the image; a neural network trained with the nnUNet framework then segments the cropped volume. The segmentation is transformed back to the original coordinate system, post-processed, and finally saved as a .seg.nrrd file.
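The data flow of these stages can be sketched in pure Python. Note that `crop_to_bbox`, `segment`, and `paste_back` below are hypothetical toy stand-ins for armcrop and the nnUNet model, shown only to make the crop/segment/transform-back sequence concrete:

```python
import numpy as np

def crop_to_bbox(volume: np.ndarray, threshold: float = 0.5):
    """Stand-in for armcrop: crop to the tight bounding box of bright voxels."""
    coords = np.argwhere(volume > threshold)
    lo, hi = coords.min(axis=0), coords.max(axis=0) + 1
    crop = volume[lo[0]:hi[0], lo[1]:hi[1], lo[2]:hi[2]]
    return crop, (lo, hi)

def segment(crop: np.ndarray) -> np.ndarray:
    """Stand-in for the nnUNet model: label every cropped voxel as cortical (2)."""
    return np.full(crop.shape, 2, dtype=np.uint8)

def paste_back(seg_crop: np.ndarray, bounds, full_shape) -> np.ndarray:
    """Place the cropped segmentation back on the original volume grid."""
    lo, hi = bounds
    seg = np.zeros(full_shape, dtype=np.uint8)
    seg[lo[0]:hi[0], lo[1]:hi[1], lo[2]:hi[2]] = seg_crop
    return seg

# Toy 8x8x8 "CT" with a small bright cube standing in for a bone.
volume = np.zeros((8, 8, 8))
volume[2:5, 2:5, 2:5] = 1.0

crop, bounds = crop_to_bbox(volume)
seg = paste_back(segment(crop), bounds, volume.shape)
```

The real pipeline additionally handles oriented (rotated) boxes and physical-space resampling, which this axis-aligned sketch omits.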

Installation

Please install PyTorch before installing armcortnet. Installation instructions are available on the official PyTorch website.

Then install armcortnet using pip:

pip install armcortnet

For faster oriented-bounding-box cropping, you can replace onnxruntime with onnxruntime-gpu.
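Assuming a CUDA-capable GPU, that swap would look like this (package names as published on PyPI; the two packages should not be installed side by side):

```shell
pip uninstall -y onnxruntime
pip install onnxruntime-gpu
```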

Usage

To generate a segmentation of the humerus or scapula from a CT volume, use the following:

import armcortnet
import SimpleITK as sitk

# initialize the segmentation model
model = armcortnet.Net(bone_type="scapula")  # or "humerus"

# perform segmentation prediction on a CT volume
pred_segmentations = model.predict(
    vol_path="path/to/input/ct.nrrd"
)
# output is a list of SimpleITK images, one for each bone_type detected in the CT
for i, pred_seg in enumerate(pred_segmentations):
    # write each of the segmentations to the disk
    sitk.WriteImage(pred_seg, f"scapula-{i}.seg.nrrd")

A mesh of the predicted bone can be generated using the following:

# perform mesh prediction on a CT volume, returns list of vtkPolyData objects
pred_meshes = model.predict_poly(
    vol_path="path/to/input/ct.nrrd"
)

# iterate over each detected object
for i, cort_trab_polys in enumerate(pred_meshes):
    # iterate over the cortical and trabecular meshes
    for j, poly in enumerate(cort_trab_polys):
        armcortnet.write_polydata(poly, f"scapula_{i}_{j}.ply")

Output Labels

The segmentation output contains the following labels:

  • 0: Background
  • 1: Other adjacent bones (e.g., clavicle, radius, ulna)
  • 2: Cortical region of bone of interest
  • 3: Trabecular region of bone of interest

Note: label 1 is removed during post-processing.
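For downstream analysis, the label map can be split into per-region binary masks. A minimal numpy sketch, using a toy array in place of a real segmentation (in practice you would get the array with `sitk.GetArrayFromImage(pred_seg)`):

```python
import numpy as np

# Toy label map standing in for a real segmentation array;
# values follow the label table above.
labels = np.array([[0, 1, 2],
                   [2, 3, 3],
                   [0, 2, 3]], dtype=np.uint8)

cortical = labels == 2               # cortical region of bone of interest
trabecular = labels == 3             # trabecular region of bone of interest
whole_bone = cortical | trabecular   # full bone-of-interest mask

print(int(whole_bone.sum()))  # → 6 bone voxels in this toy example
```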

Models

Trained models are automatically downloaded from HuggingFace Hub (gregspangenberg/armcortnet) on first use.
