
A Python package providing the MONet Bundle for nnUNet, extending it with MONAI functionalities and access to the MAIA Segmentation Portal.


MONet Bundle


This repository contains the implementation of the MONet Bundle, together with instructions on how to use it and how to convert a generic nnUNet model to the MONAI Bundle format.

For more details about the MONet Bundle, please refer to the Jupyter notebook MONet_Bundle.ipynb.

2025-07-14 UPDATE: MAIA Segmentation Portal Released!

Curious to try the models through a user-friendly web interface? Explore the newly launched MAIA Segmentation Portal, where you can quickly upload your medical images and receive predictions in seconds. Prefer to keep your data local? No problem — simply download the models and run inference on your own machine with a single command!

2025-06-25 UPDATE: Check out the MONet Bundle for FedBraTS and FedLymphoma!

The MONet Bundle has been used in the Federated Brain Tumor Segmentation (FedBraTS) and Federated Lymphoma Segmentation (FedLymphoma) projects; see the accompanying paper for details.

Download the MONet Bundle

You can download the MONet Bundle from the following link: MONet Bundle. Alternatively, you can use the following command to download it:

wget https://raw.githubusercontent.com/SimoneBendazzoli93/MONet-Bundle/main/MONetBundle.zip

or, through the provided Python script:

MONet_fetch_bundle.py --bundle_path <FOLDER_PATH>
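For example, to fetch the bundle into a local folder and check its contents (the folder name is an arbitrary choice, and the exact layout may vary between releases; later steps in this README reference models/fold_0):

# Fetch the bundle into ./MONetBundle (arbitrary folder name)
MONet_fetch_bundle.py --bundle_path ./MONetBundle

# Inspect the downloaded files; per-fold models such as models/fold_0
# are referenced later in this README
ls -R ./MONetBundle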

Convert a trained nnUNet model to MONAI Bundle

To convert a trained nnUNet model to the MONAI Bundle format, start by exporting the trained model with the nnUNetv2_export_model_to_zip command. This command exports the model to a zip file that can then be used in the conversion process.

nnUNetv2_export_model_to_zip -d 009 -o Task09_Spleen.zip -c 3d_fullres -tr nnUNetTrainer -p nnUNetPlans -chk checkpoint_final.pth checkpoint_best.pth --not_strict

For testing purposes, you can use the Task09_Spleen.zip file provided in this repository: https://github.com/SimoneBendazzoli93/nnUNet-MONAI-Bundle/releases/download/v1.0/Task09_Spleen.zip. This file contains a trained nnUNet model for the Spleen segmentation task, including only the 3d_fullres configuration and fold 0.

Next, you can build a Docker image to convert the model to the MONAI Bundle format. The Dockerfile is provided in this repository, and you can build the image with the following command:

docker build -t monet-bundle-converter .

The converter first converts the nnUNet model to the MONAI Bundle format and then creates the corresponding TorchScript model, which can be used for inference with MONAI Deploy.
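If you prefer a containerized conversion, the MONet_run_conversion command introduced in the next step can presumably also be executed inside this image, for example by mounting your working directory (this wrapper is an assumption and is not confirmed by the repository):

# Assumption: the image makes MONet_run_conversion available and operates
# on files under the mounted working directory
docker run --rm -v "$(pwd):/workspace" -w /workspace monet-bundle-converter MONet_run_conversion [arguments as below]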

To run the conversion, you can use the following command; a filled-in example follows the parameter list below:

wget https://github.com/SimoneBendazzoli93/MONet-Bundle/releases/download/v1.0/Task09_Spleen.zip
MONet_run_conversion --bundle_path <MONAI_BUNDLE_PATH> --nnunet_model <NNUNET_CHECKPOINT_PATH>.zip --dataset_name_or_id <DATASET_NAME_OR_ID> --metadata_file <CUSTOM_METADATA_FILE>

Where:

  • <MONAI_BUNDLE_PATH> is the path where you want to save the MONet Bundle.
  • <NNUNET_CHECKPOINT_PATH> is the path to the nnUNet model checkpoint file (e.g., Task09_Spleen.zip).
  • <DATASET_NAME_OR_ID> is the name or ID of the dataset to convert (e.g., 09).
  • <CUSTOM_METADATA_FILE> is the path to a custom metadata file for the bundle.
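As a concrete example with the Spleen test model from above (metadata.json is a hypothetical custom metadata file name; replace it with your own):

# Convert the downloaded Spleen checkpoint into a MONet Bundle in ./MONetBundle
MONet_run_conversion --bundle_path ./MONetBundle --nnunet_model Task09_Spleen.zip --dataset_name_or_id 09 --metadata_file metadata.json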

Package the MONet Bundle with MONAI Deploy

To package the MONet Bundle with MONAI Deploy, you can use the monai-deploy package command, which creates a deployable application that can be used for inference.

monai-deploy package examples/apps/spleen_nnunet_seg_app -c examples/apps/spleen_nnunet_seg_app.yaml -t spleen:1.0 --platform x86_64

Run inference with MONAI Deploy

The resulting Docker context can be found in the deploy/spleen-x64-workstation-dgpu-linux-amd64:1.0 directory. You can use this context to build a Docker image that can be used for inference with MONAI Deploy:

# Copy the TorchScript model to the Docker context
cp MONetBundle/models/fold_0/model.ts deploy/spleen-x64-workstation-dgpu-linux-amd64:1.0/models/model/

docker build deploy/spleen-x64-workstation-dgpu-linux-amd64:1.0 --build-arg UID=1000 --build-arg GID=1000 --build-arg UNAME=holoscan -f deploy/spleen-x64-workstation-dgpu-linux-amd64:1.0/Dockerfile -t spleen-x64-workstation-dgpu-linux-amd64:1.0

To test the resulting Docker image, you can run:

MONet_inference_dicom --dicom_study_folder <INPUT_FOLDER> --prediction_output_folder <OUTPUT_DIR> --docker-image maiacloud/spleen-x64-workstation-dgpu-linux-amd64:1.0

Here you specify the input and output folders, together with the Docker image to use. The input folder should contain all the DICOM files of the study you want to process, and the output folder will contain the predictions in DICOM SEG format, together with an additional STL file containing the 3D mesh of the segmentation.
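For example, with a local study folder (folder names are arbitrary, and the exact output file names are assumptions):

# Run DICOM inference on a local study
MONet_inference_dicom --dicom_study_folder ./dicom_study --prediction_output_folder ./predictions --docker-image maiacloud/spleen-x64-workstation-dgpu-linux-amd64:1.0

# The output folder should then contain a DICOM SEG prediction and an STL mesh, e.g.:
# predictions/segmentation.dcm
# predictions/segmentation.stl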

To build an equivalent Docker image that runs inference on NIfTI images, you can use the Dockerfile provided in the deploy/spleen-x64-workstation-dgpu-linux-amd64:1.0-nifti directory. This Dockerfile is already set up to run inference on NIfTI images and includes the necessary dependencies. To test the resulting Docker image, you can run:

MONet_inference_nifti --study_folder <INPUT_FOLDER> --prediction_output_folder <OUTPUT_DIR> --docker-image maiacloud/spleen-x64-workstation-dgpu-linux-amd64:1.0-nifti

Here you again specify the input and output folders, together with the Docker image. The input folder should contain all the NIfTI files of the study you want to process (one per modality, with the given suffix identifier), and the output folder will contain the predictions in NIfTI format.
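Since the bundle wraps an nnUNet model, the per-modality suffix identifier typically follows nnUNet's _0000, _0001, ... convention; for the single-modality (CT) Spleen task, a plausible input layout and invocation (file and folder names here are assumptions) would be:

# One NIfTI file per modality, distinguished by the suffix identifier:
# study_folder/spleen_case_0000.nii.gz   (CT volume, suffix _0000 = first modality)
MONet_inference_nifti --study_folder ./study_folder --prediction_output_folder ./predictions --docker-image maiacloud/spleen-x64-workstation-dgpu-linux-amd64:1.0-nifti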

Run MONAI Label server with MONet Bundle

To run a MONAI Label server with the MONet Bundle, you can use the MONet_MONAI_Label.py script provided in this repository. It starts a MONAI Label server backed by the MONet Bundle model, allowing you to interactively segment medical images with the trained model. You can run it with the following command:

MONet_MONAI_Label --docker-image spleen-x64-workstation-dgpu-linux-amd64:1.0-label --model_folder MONetBundle/models/fold_0
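Once the server is up, you can connect to it with a MONAI Label client such as the 3D Slicer plugin. MONAI Label servers listen on port 8000 by default (whether this script keeps that default is an assumption), so a quick health check could be:

# Query the running MONAI Label server (default port 8000 is an assumption here)
curl http://localhost:8000/info/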
