
Generated gRPC client library for the open-inference-protocol

Project description

Open Inference Protocol gRPC Client


open-inference-grpc is a generated client library based on the gRPC protocol definition tracked in the open-inference/open-inference-protocol/ repository.


Installation

This package requires Python 3.8 or greater.

Install with your favorite tool from pypi.org/project/open-inference-grpc/

$ pip install open-inference-grpc
$ poetry add open-inference-grpc

A REST-based python client (open-inference-openapi) also exists for the Open Inference Protocol, and can be installed alongside this gRPC client, as both are distributed as namespace packages.
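The namespace-package mechanism (PEP 420) is what lets two separately installed distributions share the open_inference import root. A self-contained sketch of the idea, using throwaway directories rather than the real packages' layouts:

```python
import sys
import tempfile
from pathlib import Path

# Build two separate "distributions" that both contribute a subpackage
# to the open_inference namespace. Note there is NO __init__.py in the
# open_inference directory itself -- that is what makes it an implicit
# namespace package under PEP 420.
root = Path(tempfile.mkdtemp())
for dist, sub in [("dist_grpc", "grpc"), ("dist_openapi", "openapi")]:
    pkg = root / dist / "open_inference" / sub
    pkg.mkdir(parents=True)
    (pkg / "__init__.py").write_text(f"NAME = '{sub}'\n")
    sys.path.insert(0, str(root / dist))

# Both subpackages import under the shared open_inference namespace,
# even though they live in different directories on sys.path.
from open_inference import grpc as oi_grpc
from open_inference import openapi as oi_openapi

print(oi_grpc.NAME, oi_openapi.NAME)  # -> grpc openapi
```

The directory and module names here are illustrative only; the point is that neither distribution owns open_inference/__init__.py, so Python merges both portions into one namespace.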

Example

# These dependencies are installed by open-inference-grpc
import grpc
from google.protobuf.json_format import MessageToDict

from open_inference.grpc.service import GRPCInferenceServiceStub
from open_inference.grpc.protocol import (
    ServerReadyRequest,
    ModelReadyRequest,
    ModelMetadataRequest,
    ModelInferRequest,
)


with grpc.insecure_channel("localhost:8081") as channel:
    client = GRPCInferenceServiceStub(channel)

    # Check that the server is live, and it has the iris model loaded
    client.ServerReady(ServerReadyRequest())
    client.ModelReady(ModelReadyRequest(name="iris-model"))

    # Make an inference request
    pred = client.ModelInfer(
        ModelInferRequest(
            model_name="iris-model",
            inputs=[
                {
                    "name": "input-0",
                    "datatype": "FP64",
                    "shape": [1, 4],
                    "contents": {"fp64_contents": [5.3, 3.7, 1.5, 0.2]},
                }
            ],
        )
    )

print(MessageToDict(pred))
# {
#     "modelName": "iris-model",
#     "parameters": {"content_type": {"stringParam": "np"}},
#     "outputs": [
#         {
#             "name": "output-1",
#             "datatype": "INT64",
#             "shape": ["1", "1"],
#             "parameters": {"content_type": {"stringParam": "np"}},
#             "contents": {"int64Contents": ["0"]},
#         }
#     ],
# }
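Tensor contents travel flattened in row-major order, so the length of the contents list must equal the product of the shape. A small helper for building such an input dict (an illustrative sketch, not an API of this package):

```python
from math import prod

def make_fp64_input(name, values, shape):
    """Build an input dict in the layout ModelInferRequest accepts above.

    Illustrative helper (not part of open-inference-grpc): flattens a
    nested row-major list into fp64_contents and checks the element
    count against the product of `shape`.
    """
    flat = []

    def _flatten(v):
        if isinstance(v, list):
            for item in v:
                _flatten(item)
        else:
            flat.append(float(v))

    _flatten(values)
    if len(flat) != prod(shape):
        raise ValueError(f"{len(flat)} values do not fit shape {shape}")
    return {
        "name": name,
        "datatype": "FP64",
        "shape": shape,
        "contents": {"fp64_contents": flat},
    }

inp = make_fp64_input("input-0", [[5.3, 3.7, 1.5, 0.2]], [1, 4])
```

With this helper, the inputs list in the example above could be written as inputs=[inp].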
Async versions of the same APIs are also available: use grpc.aio to create the channel, then await the requests from inside a coroutine.

async with grpc.aio.insecure_channel("localhost:8081") as channel:
    stub = GRPCInferenceServiceStub(channel)
    await stub.ServerReady(ServerReadyRequest())

Dependencies

The open-inference-grpc Python package depends only on grpcio, the underlying gRPC transport implementation.

Contribute

This client is largely generated automatically by grpcio-tools, with a small amount of build post-processing in build.py.

Run python build.py to build this package; it will:

  1. If proto/open_inference_grpc.proto is not found, download it from open-inference/open-inference-protocol/
  2. Run grpcio_tools.protoc to create the python client
  3. Postprocess filenames and imports
  4. Prepend the Apache 2.0 License preamble
  5. Format with black
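Step 3 is needed because grpcio-tools emits absolute imports between the generated modules, which break once the files are moved into a package. A sketch of what such a rewrite could look like (the module name and regex here are illustrative assumptions, not copied from build.py):

```python
import re

# grpcio-tools generates lines like the one below; a post-processing
# pass can rewrite them into package-relative imports. The module name
# is a hypothetical example.
GENERATED = "import open_inference_grpc_pb2 as open__inference__grpc__pb2\n"

def rewrite_imports(source: str) -> str:
    """Rewrite top-level `import X_pb2 as Y` lines to `from . import X_pb2 as Y`."""
    return re.sub(
        r"^import (\w+_pb2) as (\w+)$",
        r"from . import \1 as \2",
        source,
        flags=re.MULTILINE,
    )

print(rewrite_imports(GENERATED))
# -> from . import open_inference_grpc_pb2 as open__inference__grpc__pb2
```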

If you want to contribute to the open-inference-protocol itself, please create an issue or PR in the open-inference/open-inference-protocol repository.

License

By contributing to the Open Inference Protocol Python client repository, you agree that your contributions will be licensed under its Apache 2.0 License.

Download files

Download the file for your platform.

Source Distribution

open_inference_grpc-2.0.0.tar.gz (9.1 kB)

Built Distribution

open_inference_grpc-2.0.0-py3-none-any.whl (9.4 kB)

File details

Details for the file open_inference_grpc-2.0.0.tar.gz.

File metadata

  • Download URL: open_inference_grpc-2.0.0.tar.gz
  • Size: 9.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.7.0 CPython/3.8.10 Linux/5.15.133.1-microsoft-standard-WSL2

File hashes

Hashes for open_inference_grpc-2.0.0.tar.gz

  • SHA256: ab71943730309a6bb19677d57ea64f1824799cfbdbe5adaed7515852716a7ae8
  • MD5: 24163048f980dec5c9fdf9d149ba02e4
  • BLAKE2b-256: 937a9066af152ef0d2ab34e801f65ebedc5474b020e3d3d01c94d14e1faf918d


File details

Details for the file open_inference_grpc-2.0.0-py3-none-any.whl.

File metadata

  • Download URL: open_inference_grpc-2.0.0-py3-none-any.whl
  • Size: 9.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.7.0 CPython/3.8.10 Linux/5.15.133.1-microsoft-standard-WSL2

File hashes

Hashes for open_inference_grpc-2.0.0-py3-none-any.whl

  • SHA256: 73679c9b511dea1224ebecf2d6e944b6cf039f3ba24a66d1524870da47f5c071
  • MD5: 05b247f4b3633043d3b41c26af799beb
  • BLAKE2b-256: c4fbe37ca9bee512bc0c1e61a930206886dcb4dd1dce50160a30f8760103d887

