Python client for OpenVINO Model Server

Project description

OpenVINO™ Model Server Client

The OpenVINO™ Model Server Client package makes interaction with the model server easy. It is very lightweight thanks to a minimal number of included dependencies: the total size of the package, with all dependencies, is less than 100 MB.

The ovmsclient package works with both OpenVINO™ Model Server and TensorFlow Serving. It supports both the gRPC and REST APIs and the Predict, GetModelMetadata, and GetModelStatus calls.

The ovmsclient package can replace the tensorflow-serving-api package, offering a smaller footprint and a simplified interface.
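
For example, a model status check over the REST interface can be as short as the sketch below. This is an illustrative snippet: the address localhost:8000 and the model name "model" are placeholders, and the make_http_client and get_model_status calls are assumed to match the ovmsclient API reference.

import ovmsclient

# Create a REST (HTTP) connection to the model server
client = ovmsclient.make_http_client("localhost:8000")

# Check which versions of the model are loaded and ready to serve
status = client.get_model_status(model_name="model")
print(status)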

See the API reference for usage details.

Usage example

import ovmsclient

# Create connection to the model server
client = ovmsclient.make_grpc_client("localhost:9000")

# Get model metadata to learn about model inputs
model_metadata = client.get_model_metadata(model_name="model")

# If the model has only one input, get its name like this
input_name = next(iter(model_metadata["inputs"]))

# Read the image file
with open("path/to/img.jpg", 'rb') as f:
    img = f.read()

# Place the data in a dict, keyed by the model input name
inputs = {input_name: img}

# Run prediction and wait for the result
results = client.predict(inputs=inputs, model_name="model")
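
The inputs do not have to be raw image bytes; predict also accepts numpy arrays. The continuation below is an illustrative sketch: the input shape is made up for the example, and it assumes a single-output model, for which the predictions come back as a numpy array.

import numpy as np

# Alternatively, send an already preprocessed tensor instead of encoded JPEG bytes
# (the shape here is only an example; check model_metadata["inputs"] for the real one)
img_array = np.zeros((1, 3, 224, 224), dtype=np.float32)
results = client.predict(inputs={input_name: img_array}, model_name="model")

# For a single-output model, results holds the predictions as a numpy array
print(results.shape)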

Learn more on the ovmsclient documentation site.

Download files

Source Distributions

No source distribution files are available for this release.

Built Distribution

ovmsclient-2022.1-py3-none-any.whl (155.2 kB, Python 3)
