Abstraction for AI Inference Client
Python Infer Client
About Python Infer Client
Python Infer Client is a Python inference client library. It provides a single interface for interacting with multiple types of inference clients, such as onnxruntime and tritonclient.
Install
$ pip install infer-client
Usage
import numpy as np

from infer_client.adapters.onnx import OnnxInferenceAdapter
from infer_client.inference import Inference

# Build an adapter for the ONNX Runtime backend and wrap it in the generic client
adapter = OnnxInferenceAdapter(model_name="resources/test_classify", version="1", limit_mem_gpu=-1)
infer_client_obj = Inference(adapter)

# Run inference: pass a dict of named inputs and the list of output names to fetch
res = infer_client_obj.inference({"input": np.random.rand(1, 3, 224, 224)}, ["output"])
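The value of the library is that the same `Inference` facade works with any backend adapter. The following is a minimal, self-contained sketch of that adapter pattern; the `InferenceAdapter` base class and `EchoAdapter` here are illustrative stand-ins, not part of the infer-client API.

```python
import numpy as np


class InferenceAdapter:
    """Sketch of the common interface each backend adapter implements."""

    def inference(self, inputs, output_names):
        raise NotImplementedError


class EchoAdapter(InferenceAdapter):
    """Toy backend: returns a zero array per requested output, shaped like
    the first input. A real adapter would call onnxruntime or tritonclient."""

    def inference(self, inputs, output_names):
        first = next(iter(inputs.values()))
        return {name: np.zeros_like(first) for name in output_names}


class Inference:
    """Thin facade that delegates to whichever adapter it was built with."""

    def __init__(self, adapter):
        self.adapter = adapter

    def inference(self, inputs, output_names):
        return self.adapter.inference(inputs, output_names)


client = Inference(EchoAdapter())
res = client.inference({"input": np.random.rand(1, 3, 224, 224)}, ["output"])
print(res["output"].shape)  # (1, 3, 224, 224)
```

Swapping backends then means changing only the adapter passed to `Inference`; the calling code stays identical.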
Changelog
Please see CHANGELOG for more information on what has changed recently.
Contributing
Please see CONTRIBUTING for details.
Security Vulnerabilities
Please review our security policy on how to report security vulnerabilities.
Credits
License
The MIT License (MIT). Please see License File for more information.