
Raidionics segmentation and classification back-end with ONNX runtime


Raidionics backend for segmentation and classification


This code provides the segmentation and classification backend for MRI/CT volumes, using ONNX Runtime for inference.
The module can be used as a Python library, as a CLI, or as a Docker container. By default, inference is performed on CPU only.

Installation

pip install raidionicsseg
(or)
pip install git+https://github.com/dbouget/raidionics_seg_lib.git

Additional packages (i.e., torch and onnxruntime-gpu), if needed, can be installed as follows:

pip install raidionicsseg[ort-gpu]
pip install raidionicsseg[torch]

How to cite

If you are using Raidionics in your research, please cite the following references.

The final software including updated performance metrics for preoperative tumors and introducing postoperative tumor segmentation:

@article{bouget2023raidionics,
    author = {Bouget, David and Alsinan, Demah and Gaitan, Valeria and Holden Helland, Ragnhild and Pedersen, André and Solheim, Ole and Reinertsen, Ingerid},
    year = {2023},
    month = {09},
    pages = {},
    title = {Raidionics: an open software for pre- and postoperative central nervous system tumor segmentation and standardized reporting},
    volume = {13},
    journal = {Scientific Reports},
    doi = {10.1038/s41598-023-42048-7},
}

For the preliminary preoperative tumor segmentation validation and software features:

@article{bouget2022preoptumorseg,
    title={Preoperative Brain Tumor Imaging: Models and Software for Segmentation and Standardized Reporting},
    author={Bouget, David and Pedersen, André and Jakola, Asgeir S. and Kavouridis, Vasileios and Emblem, Kyrre E. and Eijgelaar, Roelant S. and Kommers, Ivar and Ardon, Hilko and Barkhof, Frederik and Bello, Lorenzo and Berger, Mitchel S. and Conti Nibali, Marco and Furtner, Julia and Hervey-Jumper, Shawn and Idema, Albert J. S. and Kiesel, Barbara and Kloet, Alfred and Mandonnet, Emmanuel and Müller, Domenique M. J. and Robe, Pierre A. and Rossi, Marco and Sciortino, Tommaso and Van den Brink, Wimar A. and Wagemakers, Michiel and Widhalm, Georg and Witte, Marnix G. and Zwinderman, Aeilko H. and De Witt Hamer, Philip C. and Solheim, Ole and Reinertsen, Ingerid},
    journal={Frontiers in Neurology},
    volume={13},
    year={2022},
    url={https://www.frontiersin.org/articles/10.3389/fneur.2022.932219},
    doi={10.3389/fneur.2022.932219},
    issn={1664-2295}
}

Usage

1. CLI

raidionicsseg CONFIG

CONFIG should point to a configuration file (*.ini), specifying all runtime parameters, according to the pattern from blank_main_config.ini.

2. Python module

from raidionicsseg import run_model
run_model(config_filename="/path/to/main_config.ini")
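Since run_model is driven entirely by an .ini file, the configuration can also be assembled programmatically. The sketch below uses the standard-library configparser to write a file and is illustrative only: the section and key names shown are hypothetical placeholders, and the authoritative parameter list is in blank_main_config.ini.

```python
import configparser
import os
import tempfile

# Sketch: building a configuration file programmatically before calling
# run_model. Section and key names here are ILLUSTRATIVE placeholders --
# consult blank_main_config.ini for the real parameters.
config = configparser.ConfigParser()
config["System"] = {"gpu_id": "-1"}  # hypothetical key: -1 = CPU-only inference

config_path = os.path.join(tempfile.mkdtemp(), "main_config.ini")
with open(config_path, "w") as f:
    config.write(f)

# The call itself matches the documented API:
# from raidionicsseg import run_model
# run_model(config_filename=config_path)
```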

3. Docker

When running the Docker images, the --user flag must be set so that folders and files created inside the container inherit the proper read/write permissions. The user ID is retrieved on the fly in the following examples, but it can be hard-coded if known.

:warning: The following Docker image can only perform inference on the CPU. A separate Docker image that leverages the GPU is described further below. If its CUDA version does not match your machine, a new Docker image can be built manually by simply changing the base torch image pulled inside Dockerfile_gpu.

docker pull dbouget/raidionics-segmenter:v1.5-py39-cpu

For opening the Docker image and interacting with it, run:

docker run --entrypoint /bin/bash -v /home/<username>/<resources_path>:/workspace/resources -t -i --network=host --ipc=host --user $(id -u) dbouget/raidionics-segmenter:v1.5-py39-cpu

The /home/<username>/<resources_path> before the colon must be changed to a directory on your local machine containing the data to expose to the Docker image. Namely, it must contain folder(s) with the images you want to run inference on, as well as a folder with the trained models to use, and a destination folder where the results will be placed.

For launching the Docker image as a CLI, run:

docker run -v /home/<username>/<resources_path>:/workspace/resources -t -i --network=host --ipc=host --user $(id -u) dbouget/raidionics-segmenter:v1.5-py39-cpu -c /workspace/resources/<path>/<to>/main_config.ini -v <verbose>

The <path>/<to>/main_config.ini must point to a valid configuration file on your machine, expressed as a path relative to the /home/<username>/<resources_path> described above. For example, if the file is located on your machine under /home/myuser/Data/Segmentation/main_config.ini, and /home/myuser/Data is the resources partition mounted in the Docker image, the relative path is Segmentation/main_config.ini.
The <verbose> level can be selected from [debug, info, warning, error].
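The host-to-container path mapping above can be sketched in a few lines of Python; the paths mirror the example from the previous paragraph, with the host directory mounted as /workspace/resources.

```python
import os.path

# Host directory mounted into the container, and a config file beneath it
# (same example paths as in the text above).
host_resources = "/home/myuser/Data"
config_on_host = "/home/myuser/Data/Segmentation/main_config.ini"

# The path passed to the container's -c flag is the path relative to the
# mount point, prefixed with the mount target inside the container.
rel = os.path.relpath(config_on_host, host_resources)
container_path = "/workspace/resources/" + rel
print(container_path)
```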

For running models on the GPU inside the Docker image, run the following CLI, with the gpu_id properly filled in the configuration file:

docker run -v /home/<username>/<resources_path>:/workspace/resources -t -i --runtime=nvidia --network=host --ipc=host --user $(id -u) dbouget/raidionics-segmenter:v1.5-py39-cuda12.4 -c /workspace/resources/<path>/<to>/main_config.ini -v <verbose>

Models

The trained models are automatically downloaded when running Raidionics or Raidionics-Slicer.
Alternatively, all existing Raidionics models can be browsed here directly.

Developers

For running inference on GPU, your machine must be properly configured (cf. here).
In the configuration file, the gpu_id parameter should then point to the GPU to be used during inference. The onnxruntime-gpu Python package must also be installed, with a version matching your driver and CUDA versions; more info can be found here.
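A quick way to verify the GPU setup (assuming onnxruntime or onnxruntime-gpu is installed) is to list the execution providers ONNX Runtime can see; if the CUDA provider is absent, inference silently falls back to CPU.

```python
# Sanity check: is the CUDA execution provider visible to ONNX Runtime?
try:
    import onnxruntime as ort
    providers = ort.get_available_providers()
except ImportError:
    providers = []  # onnxruntime not installed in this environment

print("CUDAExecutionProvider available:", "CUDAExecutionProvider" in providers)
```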

To run the unit tests, type the following within your virtual environment and within the raidionics_seg_lib folder:

pip install pytest
pytest tests/
