
Generated OpenAPI client library for the open-inference-protocol

Project description

Open Inference Protocol OpenAPI Client


open-inference-openapi is a generated client library based on the OpenAPI protocol definition tracked in the open-inference/open-inference-protocol repository.


Installation

This package requires Python 3.8 or greater.

Install with your favorite tool from pypi.org/project/open-inference-openapi/

$ pip install open-inference-openapi
$ poetry add open-inference-openapi

A gRPC-based Python client (open-inference-grpc) also exists for the Open Inference Protocol and can be installed alongside this OpenAPI client, as both are distributed as namespace packages.
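
With both packages installed, the two clients can be imported side by side from the shared open_inference namespace. A minimal sketch; the gRPC import path below is illustrative, so check the open-inference-grpc documentation for the exact module and class names:

# Both distributions install into the shared `open_inference` namespace package.
from open_inference.openapi.client import OpenInferenceClient

# Hypothetical import path for the sibling gRPC client; the actual names are
# defined by the open-inference-grpc package.
# from open_inference.grpc.client import OpenInferenceGRPCClient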

Example

from open_inference.openapi.client import OpenInferenceClient, InferenceRequest

client = OpenInferenceClient(base_url='http://localhost:5002')

# Check that the server is ready, and that it has the iris model loaded
client.check_server_readiness()
client.read_model_metadata('mlflow-model')

# Make an inference request with two examples
pred = client.model_infer(
    "mlflow-model",
    request=InferenceRequest(
        inputs=[
            {
                "name": "input",
                "shape": [2, 4],
                "datatype": "FP64",
                "data": [
                    [5.0, 3.3, 1.4, 0.2],
                    [7.0, 3.2, 4.7, 1.4],
                ],
            }
        ]
    ),
)

print(repr(pred))
# InferenceResponse(
#     model_name="mlflow-model",
#     model_version=None,
#     id="580c30e3-f835-418f-bb17-a3074d42ad21",
#     parameters={"content_type": "np", "headers": None},
#     outputs=[
#         ResponseOutput(
#             name="output-1",
#             shape=[2, 1],
#             datatype="INT64",
#             parameters={"content_type": "np", "headers": None},
#             data=TensorData(__root__=[0.0, 1.0]),
#         )
#     ],
# )
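
The raw values can then be pulled out of the response. A minimal sketch, assuming the pydantic v1-style custom root field (__root__) shown in the repr above:

# Extract the predicted class labels from the first output tensor.
# TensorData stores its values in a pydantic v1 custom root field.
output = pred.outputs[0]
labels = [int(v) for v in output.data.__root__]
print(output.name, labels)  # output-1 [0, 1]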

Async versions of the same APIs are also available. Import AsyncOpenInferenceClient instead, then await any requests made.

from open_inference.openapi.client import AsyncOpenInferenceClient

client = AsyncOpenInferenceClient(base_url="http://localhost:5002")
await client.check_server_readiness()
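
Outside of a REPL or notebook that supports top-level await, wrap the calls in an event loop. A runnable sketch using only the calls shown above:

import asyncio

from open_inference.openapi.client import AsyncOpenInferenceClient


async def main() -> None:
    client = AsyncOpenInferenceClient(base_url="http://localhost:5002")
    await client.check_server_readiness()
    print(await client.read_model_metadata("mlflow-model"))


asyncio.run(main())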

Dependencies

The open-inference-openapi Python package relies on:

  • pydantic - Message formatting, structure, and validation.
  • httpx - Implementation of the underlying HTTP transport.
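
Because messages are pydantic models, a request can be constructed and validated locally before any HTTP traffic is sent. A small sketch, assuming the pydantic v1-style API implied by the __root__ field above:

from open_inference.openapi.client import InferenceRequest

# Construction validates the message structure client-side; a malformed
# request raises pydantic.ValidationError before any network call.
req = InferenceRequest(
    inputs=[
        {
            "name": "input",
            "shape": [1, 4],
            "datatype": "FP64",
            "data": [[5.0, 3.3, 1.4, 0.2]],
        }
    ]
)

# pydantic v1-style serialization of the validated request body.
print(req.json(exclude_none=True))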

Contribute

This client is largely generated automatically by fern, with a small amount of build post-processing in build.py.

Run python build.py to build this package; it will:

  1. Download fern/openapi/open_inference_rest.yaml from open-inference/open-inference-protocol if it is not already present.
  2. Run fern generate to create the Python client (fern-api must be installed: npm install --global fern-api).
  3. Post-process the generated code to correctly implement the recursive TensorData model (see the sketch after this list).
  4. Prepend the Apache 2.0 License preamble.
  5. Format the code with black.
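
A sketch of what a recursive TensorData model can look like in pydantic v1; this is illustrative, not the exact post-processed code:

from typing import List, Union

from pydantic import BaseModel


class TensorData(BaseModel):
    # Custom root type: a tensor payload is either a flat list of scalars
    # or a list of nested tensors, matching payloads like the [2, 4] example.
    __root__: Union[List[float], List["TensorData"]]


# Resolve the forward reference to TensorData itself (pydantic v1 API).
TensorData.update_forward_refs()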

If you want to contribute to the open-inference-protocol itself, please create an issue or PR in the open-inference/open-inference-protocol repository.

License

By contributing to the Open Inference Protocol Python client repository, you agree that your contributions will be licensed under its Apache 2.0 License.

Download files

Download the file for your platform.

Source Distribution

open_inference_openapi-2.0.0.1.tar.gz (11.4 kB)

Built Distribution

open_inference_openapi-2.0.0.1-py3-none-any.whl (29.4 kB)

File details

Details for the file open_inference_openapi-2.0.0.1.tar.gz.

File metadata

  • Download URL: open_inference_openapi-2.0.0.1.tar.gz
  • Size: 11.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.2 CPython/3.11.8 Linux/5.15.146.1-microsoft-standard-WSL2

File hashes

Hashes for open_inference_openapi-2.0.0.1.tar.gz

  • SHA256: aa2fbc2650d95d81957cb120e3ad37ef3d97ee4c2df7fb60cda2a0dce792555e
  • MD5: bd5360ff4fc00afe877ad0b644e7477a
  • BLAKE2b-256: f11d7afe42c6e77bebd07b2f1c80d29ec3bc20d9488a945bcca59422b8030780


File details

Details for the file open_inference_openapi-2.0.0.1-py3-none-any.whl.

File hashes

Hashes for open_inference_openapi-2.0.0.1-py3-none-any.whl

  • SHA256: 9d2b46cff9be8bd3f56adad171d2fa6986e4756413d09deff6e1bf2d85cae124
  • MD5: 102cfa2e6b4867c0833235f2bbb71f92
  • BLAKE2b-256: 8aec88d723d8025d00942d506e802648e2cc5342e714a338409e106f9c82d300

