Streaming Inference Extension for MONAI

MONAI Stream

MONAI Stream SDK aims to equip experienced MONAI researchers and developers with the ability to build streaming inference pipelines while enjoying the familiar MONAI development experience and utilities.

MONAI Stream pipelines begin with a source component and end with a sink component, connected by a series of filter components as shown below.

(Diagram: MONAI Stream pipeline architecture)
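The source-filter-sink pattern described above can be sketched generically in plain Python. This is purely illustrative and not the SDK's actual API; all names here are hypothetical:

```python
from typing import Callable, Dict, Iterable, List

Frame = Dict[str, object]

def frame_source(n: int) -> Iterable[Frame]:
    """Source: yields raw frames (stand-in for URISource or a capture card)."""
    for i in range(n):
        yield {"frame_id": i, "image": [float(i)] * 4}

def resize(frame: Frame) -> Frame:
    """Filter: a stand-in frame transformation (e.g., resizing/scaling)."""
    frame["image"] = frame["image"][:2]
    return frame

def collect_ids(frames: Iterable[Frame]) -> List[int]:
    """Sink: consumes the final frames (stand-in for a display or fake sink)."""
    return [f["frame_id"] for f in frames]

def run_pipeline(source, filters, sink):
    # Chain each filter lazily over the frame stream, then drain into the sink.
    frames = source
    for flt in filters:
        frames = map(flt, frames)
    return sink(frames)

result = run_pipeline(frame_source(3), [resize], collect_ids)
print(result)  # frame ids that reached the sink
```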

MONAI Stream SDK natively supports:

  • a number of input component types, including real-time streams (RTSP), streaming URLs, local video files,
    AJA capture cards with direct memory access to the GPU, and a Fake Source for testing purposes,
  • output components that let developers view the results of their pipelines, or simply test them via a Fake Sink,
  • a number of filter types, including format conversion, video frame resizing and/or scaling, and, most importantly, a MONAI transform component that allows developers to plug MONAI transformations into a MONAI Stream pipeline,
  • the Clara AGX Developer Kit in dGPU configuration.

The diagram below shows a visualization of a MONAI Stream pipeline in which a URISource is chained to video conversion, an inference service, and, importantly, a TransformChainComponent, which allows MONAI transformations (or any compatible callables that accept Dict[str, torch.Tensor]) to be plugged into the MONAI Stream pipeline. The results are then visualized on the screen via NVEglGlesSink.

In the conceptual example pipeline above, NVInferServer passes both the original image and all the inference model outputs to the transform chain component. The developer may choose to manipulate the two pieces of data separately or together to create the desired output for display.

TransformChainComponent presents MONAI transforms with torch.Tensor data containing a single frame of the video stream. Internally, TransformChainComponent provides a compatibility layer between MONAI and the underlying DeepStream SDK backbone, so that MONAI developers can plug existing MONAI inference code into DeepStream.
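Conceptually, the transform chain applies each callable in order to a dict of named buffers for one frame. The sketch below illustrates that idea only; in MONAI Stream the buffers would be torch.Tensor, but plain lists are used here to keep the example dependency-free, and both transforms are hypothetical stand-ins:

```python
from typing import Callable, Dict, List

Buffers = Dict[str, List[float]]

def scale_intensity(data: Buffers) -> Buffers:
    """Rescale the 'image' buffer to [0, 1] (stand-in for a MONAI transform)."""
    img = data["image"]
    lo, hi = min(img), max(img)
    span = (hi - lo) or 1.0
    data["image"] = [(v - lo) / span for v in img]
    return data

def threshold_pred(data: Buffers) -> Buffers:
    """Binarize the model output at 0.5 (stand-in for a discretization transform)."""
    data["pred"] = [1.0 if v > 0.5 else 0.0 for v in data["pred"]]
    return data

def run_chain(chain: List[Callable[[Buffers], Buffers]], frame: Buffers) -> Buffers:
    # The adapter calls each transform in order on one frame's buffers.
    for transform in chain:
        frame = transform(frame)
    return frame

# One frame carrying the original image and the model output together,
# mirroring how NVInferServer hands both to the transform chain.
frame = {"image": [0.0, 128.0, 255.0], "pred": [0.2, 0.8, 0.9]}
out = run_chain([scale_intensity, threshold_pred], frame)
print(out["image"], out["pred"])
```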

Features

The codebase is currently under active development.

  • Framework to allow MONAI-style inference pipelines for streaming data.
  • Allows for MONAI chained transformations to be used on streaming data.
  • Inference models can be used natively in MONAI or deployed via Triton Inference Server.
  • Native support for x86 and Clara AGX architectures,
    • with the future aim of allowing developers to deploy the same code on both architectures with no changes.

Getting Started: x86 Development Container Setup

Creating a Local Development Container

To build a developer container for your workstation, simply clone the repo and run the setup script as follows:

# clone the latest release from the repo
git clone -b <release_tag> https://github.com/Project-MONAI/MONAIStream

# start development setup script
cd MONAIStream
./start_devel.sh

Once the setup script completes successfully, a container will be running with all the libraries needed to start designing MONAI Stream SDK inference pipelines. Development, however, is limited to within the container and the mounted volumes. Developers may modify Dockerfile.devel and start_devel.sh to suit their needs.

Connecting VSCode to the Development Container

To start developing within the newly created MONAI Stream SDK development container, users may choose their favorite editor or IDE. Here, we show how to set up VSCode on a local machine to start developing MONAI Stream inference pipelines.

  1. Install VSCode on your Linux development workstation.
  2. Install the Remote Development Extension pack and restart VSCode.
  3. In VSCode, select the icon of the newly installed Remote Development extension on the left.
  4. Select "Containers" under "Remote Explorer" at the top of the dialog.
  5. Attach to the MONAI Stream SDK container by clicking the "Attach to Container" icon next to the container name.

The above steps should allow the user to develop inside the MONAI Stream container using VSCode.

Run the Ultrasound Inference Sample App

MONAI Stream SDK comes with example inference pipelines. Here, we run a sample app to perform instrument segmentation in an ultrasound video.

Inside the development container perform the following steps.

  1. Download the ultrasound data and models in the container.
mkdir -p /app/data
cd /app/data
wget https://github.com/Project-MONAI/MONAIStream/releases/download/data/US.zip
unzip US.zip -d .
  2. Copy the ultrasound video to /app/videos/Q000_04_tu_segmented_ultrasound_256.avi, as the example app expects.
mkdir -p /app/videos
cp /app/data/US/Q000_04_tu_segmented_ultrasound_256.avi /app/videos/.
  3. Convert the PyTorch or ONNX model to a TRT engine.

    a. To convert the provided ONNX model to a TRT engine, use:

    cd /app/data/US/
    /usr/src/tensorrt/bin/trtexec --onnx=us_unet_256x256.onnx --saveEngine=model.engine --explicitBatch --verbose --workspace=1000
    

    b. To convert the PyTorch model to a TRT engine, use:

    cd /app/data/US/
    monaistream convert -i us_unet_jit.pt -o monai_unet.engine -I INPUT__0 -O OUTPUT__0 -S 1 3 256 256
    
  4. Copy the ultrasound segmentation model under /app/models/monai_unet_trt/1, as our sample app expects.

mkdir -p /app/models/monai_unet_trt/1
cp /app/data/US/monai_unet.engine /app/models/monai_unet_trt/1/.
cp /app/data/US/config_us_trt.pbtxt /app/models/monai_unet_trt/config.pbtxt
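After this step, the model directory should look like the following (a sketch inferred from the copy commands above; the layout follows Triton Inference Server's standard model-repository convention of one numbered version directory plus a config.pbtxt per model):

```text
/app/models
└── monai_unet_trt
    ├── config.pbtxt
    └── 1
        └── monai_unet.engine
```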
  5. Now we are ready to run the example streaming ultrasound bone scoliosis segmentation pipeline.
cd /sample/monaistream-pytorch-pp-app
python main.py
