

TorchServe Python Client

Install

pip install torchserve_client
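
To quickly verify the install, you can import the package from the command line (the printed message here is just illustrative):

python -c "import torchserve_client; print('torchserve_client imported successfully')"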

Usage

Using torchserve_client is a breeze! It supports both TorchServe's REST and gRPC APIs.

REST Client

To make calls to the REST endpoints, simply initialize a TorchServeClientREST object as shown below:

from torchserve_client import TorchServeClientREST

# Initialize the REST TorchServeClient object
ts_client = TorchServeClientREST()
ts_client
TorchServeClient(base_url=http://localhost, management_port=8081, inference_port=8080)

If you wish to customize the base URL, management port, or inference port of your TorchServe server, you can pass them as arguments during initialization:

from torchserve_client import TorchServeClientREST

# Customize the base URL, management port, and inference port
ts_client = TorchServeClientREST(base_url='http://your-torchserve-server.com',
                                 management_port=8081, inference_port=8080)
ts_client
TorchServeClient(base_url=http://your-torchserve-server.com, management_port=8081, inference_port=8080)
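
Under the hood, the REST client talks to TorchServe's standard Management API (default port 8081) and Inference API (default port 8080). For reference, here is a minimal sketch of those raw endpoints using requests; the model name my_model and the input file sample.json are placeholders for your own setup:

import requests

# Management API (default port 8081): list the models registered with TorchServe
resp = requests.get('http://localhost:8081/models')
print(resp.json())

# Inference API (default port 8080): run a prediction against a model
# 'my_model' and 'sample.json' are placeholders for your own model and input
with open('sample.json', 'rb') as f:
    resp = requests.post('http://localhost:8080/predictions/my_model', data=f)
print(resp.json())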

gRPC Client

To create a gRPC client, simply initialize a TorchServeClientGRPC object:

from torchserve_client import TorchServeClientGRPC

# Initialize the gRPC TorchServeClient object
ts_client = TorchServeClientGRPC()
ts_client
TorchServeClientGRPC(base_url=localhost, management_port=7071, inference_port=7070)

To customize the base URL and default ports, pass them as arguments during initialization:

from torchserve_client import TorchServeClientGRPC

# Initialize the gRPC TorchServeClient object
ts_client = TorchServeClientGRPC(base_url='http://your-torchserve-server.com',
                                 management_port=7071, inference_port=7070)
ts_client
TorchServeClientGRPC(base_url=your-torchserve-server.com, management_port=7071, inference_port=7070)
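
The ports above match TorchServe's default gRPC ports (7070 for inference, 7071 for management). If you want to confirm those ports are reachable before going through the client, here is a minimal connectivity check using the grpcio package directly; this is not part of torchserve_client itself:

import grpc

# Simple connectivity check against TorchServe's default gRPC ports;
# 7070 serves the Inference API and 7071 the Management API
for port in (7070, 7071):
    channel = grpc.insecure_channel(f'localhost:{port}')
    grpc.channel_ready_future(channel).result(timeout=5)  # raises on timeout
    print(f'gRPC port {port} is reachable')
    channel.close()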

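Whichever transport you pick, the client object is what you drive your Management and Inference calls through. The sketch below is purely hypothetical: list_models and predictions are assumed method names, not confirmed parts of the torchserve_client API, so check the package documentation for the actual interface:

# Hypothetical usage sketch -- list_models() and predictions() are assumed
# method names, not confirmed parts of the torchserve_client API
models = ts_client.management.list_models()
result = ts_client.inference.predictions('my_model', data={'input': 'sample text'})
print(models, result)
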
With these intuitive APIs at your disposal, you can harness the full power of TorchServe's Management and Inference APIs and take your application to the next level. Happy inferencing! 🚀🔥
