
Model API: model wrappers and pipelines for inference with OpenVINO

Project description

Python* Model API package

The Model API package is a set of wrapper classes for particular tasks and model architectures. It simplifies data preprocessing and postprocessing as well as routine procedures (model loading, asynchronous execution, etc.). An application feeds the model class with input data, and the model returns postprocessed output data in a user-friendly format.

Package structure

The Model API consists of 3 libraries:

  • adapters implements a common interface that allows Model API wrappers to be used with different executors. See the Model API Adapters section.
  • models implements wrappers for Open Model Zoo models. See the Model API Wrappers section.
  • pipelines implements pipelines for model inference and manages synchronous/asynchronous execution. See the Model API Pipelines section.

Prerequisites

The package requires:

  • one of the Python versions supported by OpenVINO (see the OpenVINO documentation for details)
  • OpenVINO™ toolkit

If you build the Model API package from source, you should install the OpenVINO™ toolkit. There are two options:

Use the installation package for Intel® Distribution of OpenVINO™ toolkit, or build the open-source version available in the OpenVINO GitHub repository using the build instructions.

Also, you can install the OpenVINO Python* package via the command:

pip install openvino

Installing Python* Model API package

Use the following command to install Model API from source:

pip install <omz_dir>/demos/common/python

Alternatively, you can generate the package using a wheel. Follow the steps below:

  1. Build the wheel.
python <omz_dir>/demos/common/python/setup.py bdist_wheel

The wheel should appear in the dist folder. Name example: openmodelzoo_modelapi-0.0.0-py3-none-any.whl

  2. Install the package in a clean environment with the --force-reinstall key.
pip install openmodelzoo_modelapi-0.0.0-py3-none-any.whl --force-reinstall

To verify the package is installed, you can use the following command:

python -c "from openvino.model_zoo import model_api"

Model API Wrappers

The Model API package provides model wrappers that implement standardized preprocessing/postprocessing functions per "task type" and encapsulate model-specific logic, so that different models can be used in a unified manner inside the application.

The following tasks can be solved with the provided wrappers; each task type is listed below with its Model API wrappers (a usage sketch follows the list):

Classification:
  • ClassificationModel
Human Pose Estimation:
  • KeypointDetectionModel
Instance Segmentation:
  • MaskRCNNModel
Object Detection:
  • SSD
  • YOLO
  • YoloV3ONNX
  • YoloV4
  • YOLOF
  • YOLOX
Semantic Segmentation:
  • SegmentationModel
Visual Prompting:
  • SAMDecoder
  • SAMImageEncoder
Action Classification:
  • ActionClassificationModel
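
For example, a classification wrapper follows the same pattern as the SSD example shown at the end of this document. The snippet below is a minimal sketch: the IR model path is a placeholder, and only the constructor arguments already demonstrated in that example are assumed.

import cv2
# import the classification wrapper and the OpenVINO adapter helpers
from model_api.models import ClassificationModel
from model_api.adapters import OpenvinoAdapter, create_core

# placeholder path to a classification model in IR format
model_path = "public/resnet-50-tf/FP32/resnet-50-tf.xml"

# create the adapter and the wrapper; preload=True loads the model right away
adapter = OpenvinoAdapter(create_core(), model_path, device="CPU")
classification_model = ClassificationModel(adapter, preload=True)

# preprocessing, synchronous inference and postprocessing in a single call
predictions = classification_model(cv2.imread("sample.png"))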

Model API Adapters

Model API wrappers are executor-agnostic: they do not implement model loading or inference themselves. Instead, they can be used with different executors, each of which implements the common interface methods in its respective adapter class.

Currently, OpenvinoAdapter, OVMSAdapter, and ONNXRuntimeAdapter are supported.

OpenVINO Adapter

OpenvinoAdapter hides the OpenVINO™ toolkit API and allows Model API wrappers to be launched with models represented in the Intermediate Representation (IR) format. It accepts a path to either an .xml or an .onnx model file.
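
For instance, the same adapter class can be constructed from either format (the model paths below are placeholders):

from model_api.adapters import OpenvinoAdapter, create_core

# adapter created from a model in IR format (placeholder path)
ir_adapter = OpenvinoAdapter(create_core(), "model.xml", device="CPU")

# adapter created from a model in ONNX format (placeholder path)
onnx_adapter = OpenvinoAdapter(create_core(), "model.onnx", device="CPU")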

OpenVINO Model Server Adapter

OVMSAdapter hides the OpenVINO Model Server Python client API and allows Model API wrappers to be launched with models served by OVMS.

Refer to OVMSAdapter to learn about running demos with OVMS.

To use the OpenVINO Model Server Adapter, install the package with the extra module:

pip install <omz_dir>/demos/common/python[ovms]
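
A minimal sketch of connecting a wrapper to a served model is shown below; the service address and model name are placeholders, and the exact format of the target string is described in the OVMSAdapter documentation.

from model_api.adapters import OVMSAdapter
from model_api.models import SSD

# <address>:<port>/models/<model_name> points to a model served by OVMS;
# the values used here are placeholders
adapter = OVMSAdapter("localhost:9000/models/ssd_model")

# the wrapper is used in the same way as with the other adapters
ssd_model = SSD(adapter, preload=True)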

ONNXRuntime Adapter

ONNXRuntimeAdapter hides the ONNX Runtime API and allows Model API wrappers to be launched with models represented in ONNX format. It accepts a path to an .onnx file. This adapter's functionality is limited: it does not support model reshaping or asynchronous inference, and it was tested only on a limited set of models. Supported model wrappers: SSD, MaskRCNNModel, SegmentationModel, and ClassificationModel.

To use this adapter, install extra dependencies:

pip install onnx onnxruntime
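
A minimal sketch, assuming ONNXRuntimeAdapter is importable from model_api.adapters alongside the other adapters and is constructed from the .onnx path described above (the model path is a placeholder):

from model_api.adapters import ONNXRuntimeAdapter
from model_api.models import SSD

# construct the adapter from an ONNX model file (placeholder path)
adapter = ONNXRuntimeAdapter("ssd_model.onnx")

# only SSD, MaskRCNNModel, SegmentationModel and ClassificationModel
# wrappers are supported with this adapter
ssd_model = SSD(adapter, preload=True)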

Model API Pipelines

Model API Pipelines are high-level wrappers that manage input data submission and access to model results. They submit data for model inference, check the inference status (whether the result is ready or not), and provide access to the results.

The AsyncPipeline is available, which handles the asynchronous execution of a single model.
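
The sketch below outlines the intended usage; the method names follow the asynchronous Open Model Zoo Python* demos, and the frames list together with the result handling are placeholders.

import cv2
from model_api.adapters import OpenvinoAdapter, create_core
from model_api.models import SSD
from model_api.pipelines import AsyncPipeline

# create the wrapper as in the synchronous example below (placeholder model path)
model_path = "public/mobilenet-ssd/FP32/mobilenet-ssd.xml"
ssd_model = SSD(OpenvinoAdapter(create_core(), model_path, device="CPU"), preload=True)

# wrap the model into an asynchronous pipeline
pipeline = AsyncPipeline(ssd_model)

frames = [cv2.imread("sample.png")]  # placeholder list of input images

# submit frames without waiting for the previously submitted ones to finish
for frame_id, frame in enumerate(frames):
    # wait for a free infer request before submitting the next frame
    if not pipeline.is_ready():
        pipeline.await_any()
    pipeline.submit_data(frame, frame_id, {"frame": frame})

# wait until every submitted request is completed, then collect the results
pipeline.await_all()
for frame_id in range(len(frames)):
    results = pipeline.get_result(frame_id)
    if results:
        detections, frame_meta = results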

Ready-to-use Model API solutions

To apply Model API wrappers in custom applications, study the provided example of a common Model API usage scenario.

In the example, the SSD architecture is used to predict bounding boxes on the input image "sample.png". The model is executed with OpenvinoAdapter, therefore we pass the path to the model's .xml file.

Once the SSD model wrapper instance is created, we get the model's predictions in one line: ssd_model(input_data). The wrapper runs the preprocess method, synchronous inference on the OpenVINO™ toolkit side, and the postprocess method.

import cv2
# import model wrapper class
from model_api.models import SSD
# import inference adapter and helper for runtime setup
from model_api.adapters import OpenvinoAdapter, create_core


# read input image using opencv
input_data = cv2.imread("sample.png")

# define the path to mobilenet-ssd model in IR format
model_path = "public/mobilenet-ssd/FP32/mobilenet-ssd.xml"

# create adapter for OpenVINO™ runtime, pass the model path
inference_adapter = OpenvinoAdapter(create_core(), model_path, device="CPU")

# create model API wrapper for SSD architecture
# preload=True loads the model on CPU inside the adapter
ssd_model = SSD(inference_adapter, preload=True)

# apply input preprocessing, sync inference, model output postprocessing
results = ssd_model(input_data)

To study more complex scenarios, refer to the Open Model Zoo Python* demos, where asynchronous inference is applied.

Download files

Download the file for your platform.

Source Distribution

openvino_model_api-0.3.0.4.tar.gz (86.9 kB)

Uploaded Source

Built Distribution


openvino_model_api-0.3.0.4-py3-none-any.whl (115.0 kB)

Uploaded Python 3

File details

Details for the file openvino_model_api-0.3.0.4.tar.gz.

File metadata

  • Download URL: openvino_model_api-0.3.0.4.tar.gz
  • Upload date:
  • Size: 86.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for openvino_model_api-0.3.0.4.tar.gz
Algorithm Hash digest
SHA256 c562b9bdd34b717ae258e0e9d218a419479c42844b661562036026e5b774a670
MD5 aecf43055c8126cd280ae4570376b385
BLAKE2b-256 f143c55daad9d732f8461244a9a60722ebf911f405ec4a56ce787f261580b13d


Provenance

The following attestation bundles were made for openvino_model_api-0.3.0.4.tar.gz:

Publisher: publish.yaml on open-edge-platform/model_api

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file openvino_model_api-0.3.0.4-py3-none-any.whl.

File metadata

File hashes

Hashes for openvino_model_api-0.3.0.4-py3-none-any.whl
Algorithm Hash digest
SHA256 bd132f68fa6f4affc2420bbea80c3b0af2e9b11657058ffab46d6c46915badb3
MD5 cd1273e08df0a06f9ee9a1ffad337000
BLAKE2b-256 20fb92c3a865a8987b215179ab7d23a0b97600b32ef54bad301c37ac2fcdc174


Provenance

The following attestation bundles were made for openvino_model_api-0.3.0.4-py3-none-any.whl:

Publisher: publish.yaml on open-edge-platform/model_api

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
