
Open Inference Protocol gRPC Client

Generated gRPC client library for the open-inference-protocol

open-inference-grpc is a client library generated from the gRPC protocol definition tracked in the open-inference/open-inference-protocol repository.


Installation

This package requires Python 3.8 or greater.

Install with your favorite tool from pypi.org/project/open-inference-grpc/

$ pip install open-inference-grpc
$ poetry add open-inference-grpc

A REST-based Python client (open-inference-openapi) also exists for the Open Inference Protocol and can be installed alongside this gRPC client, as both are distributed as namespace packages.
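Namespace packages (PEP 420) are what let two separately installed distributions contribute modules to the same open_inference package. A self-contained illustration using throwaway modules (the directory and module names here are made up for the demo, not the actual package layout):

```python
# Two install roots each provide an "open_inference" directory with NO
# __init__.py; Python merges them into one implicit namespace package.
import importlib
import os
import sys
import tempfile

root = tempfile.mkdtemp()
for dist, mod in [("dist_a", "grpc_part"), ("dist_b", "openapi_part")]:
    pkg = os.path.join(root, dist, "open_inference")
    os.makedirs(pkg)  # note: deliberately no __init__.py
    with open(os.path.join(pkg, mod + ".py"), "w") as f:
        f.write(f"NAME = '{mod}'\n")
    sys.path.insert(0, os.path.join(root, dist))

# Both modules import from the single shared namespace.
a = importlib.import_module("open_inference.grpc_part")
b = importlib.import_module("open_inference.openapi_part")
print(a.NAME, b.NAME)  # grpc_part openapi_part
```

This is the same mechanism that allows open_inference.grpc and open_inference.openapi to come from two different wheels without conflicting.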

Example

# These dependencies are installed by open-inference-grpc
import grpc
from google.protobuf.json_format import MessageToDict

from open_inference.grpc.service import GRPCInferenceServiceStub
from open_inference.grpc.protocol import (
    ServerReadyRequest,
    ModelReadyRequest,
    ModelMetadataRequest,
    ModelInferRequest,
)


with grpc.insecure_channel("localhost:8081") as channel:
    client = GRPCInferenceServiceStub(channel)

    # Check that the server is ready, and that it has the iris model loaded
    client.ServerReady(ServerReadyRequest())
    client.ModelReady(ModelReadyRequest(name="iris-model"))

    # Make an inference request
    pred = client.ModelInfer(
        ModelInferRequest(
            model_name="iris-model",
            inputs=[
                {
                    "name": "input-0",
                    "datatype": "FP64",
                    "shape": [1, 4],
                    "contents": {"fp64_contents": [5.3, 3.7, 1.5, 0.2]},
                }
            ],
        )
    )

print(MessageToDict(pred))
# {
#     "modelName": "iris-model",
#     "parameters": {"content_type": {"stringParam": "np"}},
#     "outputs": [
#         {
#             "name": "output-1",
#             "datatype": "INT64",
#             "shape": ["1", "1"],
#             "parameters": {"content_type": {"stringParam": "np"}},
#             "contents": {"int64Contents": ["0"]},
#         }
#     ],
# }
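The inputs entry above uses the protocol's flat tensor encoding: contents holds the values in row-major order, and shape describes how to reinterpret them. A small helper that builds such a dict from a nested Python list (hypothetical, not part of this package, and fixed to FP64 for brevity):

```python
def to_infer_input(name, data, datatype="FP64"):
    """Build an open-inference tensor dict from a nested list (row-major)."""
    # Derive the shape by walking down the first element of each nesting level.
    shape, probe = [], data
    while isinstance(probe, list):
        shape.append(len(probe))
        probe = probe[0] if probe else None
    # Flatten the nested list into a single row-major sequence of values.
    flat = data
    while flat and isinstance(flat[0], list):
        flat = [value for row in flat for value in row]
    return {
        "name": name,
        "datatype": datatype,
        "shape": shape,
        "contents": {"fp64_contents": flat},
    }

print(to_infer_input("input-0", [[5.3, 3.7, 1.5, 0.2]]))
# {'name': 'input-0', 'datatype': 'FP64', 'shape': [1, 4],
#  'contents': {'fp64_contents': [5.3, 3.7, 1.5, 0.2]}}
```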
Async versions of the same APIs are also available: create the channel with grpc.aio instead, then await any requests made from inside a coroutine driven by asyncio.run.

import asyncio

async def main():
    async with grpc.aio.insecure_channel("localhost:8081") as channel:
        stub = GRPCInferenceServiceStub(channel)
        await stub.ServerReady(ServerReadyRequest())

asyncio.run(main())

Dependencies

The open-inference-grpc Python package depends only on grpcio, the underlying gRPC transport implementation.

Contribute

This client is largely generated automatically by grpc-tools, with a small amount of build post-processing in build.py.

Run python build.py to build this package; it will:

  1. If proto/open_inference_grpc.proto is not found, download it from open-inference/open-inference-protocol/
  2. Run grpcio_tools.protoc to create the python client
  3. Postprocess filenames and imports
  4. Prepend the Apache 2.0 License preamble
  5. Format with black
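Step 3 is needed because grpcio_tools.protoc emits flat module names with absolute imports, which must be rewritten to live under the open_inference.grpc namespace. A minimal sketch of what such a post-processing pass can look like (illustrative only; the actual logic lives in build.py):

```python
import re

def rewrite_imports(src: str) -> str:
    """Rewrite absolute imports of generated *_pb2 / *_pb2_grpc modules
    into namespaced imports under open_inference.grpc."""
    return re.sub(
        r"^import (\w+_pb2(?:_grpc)?)",
        r"from open_inference.grpc import \1",
        src,
        flags=re.MULTILINE,
    )

generated = "import open_inference_grpc_pb2 as pb2\n"
print(rewrite_imports(generated))
# from open_inference.grpc import open_inference_grpc_pb2 as pb2
```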

If you want to contribute to the open-inference-protocol itself, please create an issue or PR in the open-inference/open-inference-protocol repository.

License

By contributing to the Open Inference Protocol Python client repository, you agree that your contributions will be licensed under its Apache 2.0 License.

