
Abstraction for AI Inference Client

Project description

Python Infer Client


About Python Infer Client

Python Infer Client is a Python inference client library. It provides a single interface for interacting with several kinds of inference backends, such as onnxruntime and tritonclient.

Install

To use the tritonclient backend (only gRPC is supported):

$ pip install infer-client[tritonclient]

To use the onnxruntime backend (both CPU and GPU are supported):

$ pip install infer-client[onnxruntime]
or
$ pip install infer-client[onnxruntime-gpu]

Usage

import numpy as np

from infer_client.adapters.onnx import OnnxInferenceAdapter
from infer_client.inference import Inference


adapter = OnnxInferenceAdapter(model_name="resources/test_classify", version="1", limit_mem_gpu=-1)
infer_client_obj = Inference(adapter)

res = infer_client_obj.inference({"input": np.random.rand(1, 3, 224, 224)}, ["output"])
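Under the hood, a design like this follows the adapter pattern: each backend is wrapped in an adapter exposing the same inference interface, and the Inference wrapper simply delegates to whichever adapter it was given. A minimal self-contained sketch of that idea (IdentityAdapter and SimpleInference are illustrative stand-ins, not the library's actual classes):

```python
import numpy as np


class IdentityAdapter:
    """Illustrative backend adapter: echoes its inputs back.

    A real adapter (ONNX Runtime, Triton, ...) would run a model
    here; only the interface shape matters for this sketch.
    """

    def inference(self, inputs, output_names):
        # Return one array per input; a real backend would map
        # output_names to the model's actual output tensors.
        return [np.asarray(v) for v in inputs.values()]


class SimpleInference:
    """Thin wrapper that delegates to whatever adapter it holds."""

    def __init__(self, adapter):
        self.adapter = adapter

    def inference(self, inputs, output_names):
        return self.adapter.inference(inputs, output_names)


client = SimpleInference(IdentityAdapter())
res = client.inference({"input": np.random.rand(1, 3, 224, 224)}, ["output"])
print(res[0].shape)  # (1, 3, 224, 224)
```

Swapping backends then means constructing a different adapter and passing it to the same wrapper, which is what the OnnxInferenceAdapter example above does.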

Changelog

Please see CHANGELOG for more information on what has changed recently.

Contributing

Please see CONTRIBUTING for details.

Security Vulnerabilities

Please review our security policy on how to report security vulnerabilities.

Credits

License

The MIT License (MIT). Please see License File for more information.

