tritony - Tiny configuration for Triton Inference Server
What is this?
If you look at the official examples, it is hard to know where to start. With tritony, the same inference client fits in a few short lines of code, as in the example below.
```python
import argparse
import os
from glob import glob

import numpy as np
from PIL import Image

from tritony import InferenceClient


def preprocess(img, dtype=np.float32, h=224, w=224, scaling="INCEPTION"):
    sample_img = img.convert("RGB")
    resized_img = sample_img.resize((w, h), Image.Resampling.BILINEAR)
    resized = np.array(resized_img)
    if resized.ndim == 2:
        resized = resized[:, :, np.newaxis]
    scaled = (resized / 127.5) - 1
    ordered = np.transpose(scaled, (2, 0, 1))
    return ordered.astype(dtype)


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--image_folder", type=str, help="Input folder.")
    FLAGS = parser.parse_args()

    client = InferenceClient.create_with("densenet_onnx", "0.0.0.0:8001", input_dims=3, protocol="grpc")
    client.output_kwargs = {"class_count": 1}

    image_data = []
    for filename in glob(os.path.join(FLAGS.image_folder, "*")):
        image_data.append(preprocess(Image.open(filename)))

    result = client(np.asarray(image_data))

    for output in result:
        max_value, arg_max, class_name = output[0].decode("utf-8").split(":")
        print(f"{max_value} ({arg_max}) = {class_name}")
```
Key Features
- Simple configuration: only `$host:$port` and `$model_name` are required
- Generating asynchronous requests with `asyncio.Queue`
- Simple model switching
- Support for the async tritonclient
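The asynchronous request generation mentioned above is built on `asyncio.Queue`. As a rough, standalone illustration of that producer/consumer pattern (not tritony's actual internals; the doubling step is a stand-in for a gRPC inference call):

```python
import asyncio

async def producer(queue, items):
    # Enqueue pending requests; None signals that the stream is done.
    for item in items:
        await queue.put(item)
    await queue.put(None)

async def consumer(queue, results):
    # Drain the queue, handling one request at a time.
    while True:
        item = await queue.get()
        if item is None:
            break
        results.append(item * 2)  # stand-in for an inference call

async def main():
    queue = asyncio.Queue(maxsize=4)  # bounded queue applies back-pressure
    results = []
    await asyncio.gather(producer(queue, range(5)), consumer(queue, results))
    return results

print(asyncio.run(main()))  # [0, 2, 4, 6, 8]
```

A bounded queue lets the producer keep reading input while the consumer is busy, without letting unbounded work pile up in memory.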
Requirements
```shell
$ pip install tritonclient[all]
```
Install
```shell
$ pip install tritony
```
Test
With Triton
```shell
docker run --rm \
    -v ${PWD}:/models \
    nvcr.io/nvidia/tritonserver:22.01-pyt-python-py3 \
    tritonserver --model-repository=/models
```

```shell
pytest -s tests/test_tritony.py
```
Example with image_client.py
- Follow the steps in the official Triton server documentation

```shell
# Download images from https://github.com/triton-inference-server/server.git
python ./example/image_client.py --image_folder "./server/qa/images"
```