Python Client for TorchServe APIs

Project description

TorchServe Python Client

Install

pip install torchserve_client

Usage

Using torchserve_client is a breeze! It supports both the REST and gRPC APIs.

REST Client

To make calls to the REST endpoints, initialize a TorchServeClientREST object as shown below:

from torchserve_client import TorchServeClientREST

# Initialize the REST TorchServeClient object
ts_client = TorchServeClientREST()
ts_client
TorchServeClient(base_url=http://localhost, management_port=8081, inference_port=8080)

If you wish to customize the base URL, management port, or inference port of your TorchServe server, you can pass them as arguments during initialization:

from torchserve_client import TorchServeClientREST

# Customize the base URL, management port, and inference port
ts_client = TorchServeClientREST(base_url='http://your-torchserve-server.com',
                                 management_port=8081, inference_port=8080)
ts_client
TorchServeClient(base_url=http://your-torchserve-server.com, management_port=8081, inference_port=8080)
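Under the hood, the REST client talks to TorchServe's standard HTTP APIs: inference on port 8080 (e.g. POST /predictions/{model_name}) and management on port 8081. As a rough sketch of what the client abstracts away, here is how the inference URL is assembled and called with only the standard library; the helper name, model name, and payload are illustrative, not part of the torchserve_client API:

```python
import json
import urllib.request

def predictions_url(base_url: str, inference_port: int, model_name: str) -> str:
    # TorchServe's inference API serves predictions at /predictions/{model_name}
    return f"{base_url}:{inference_port}/predictions/{model_name}"

url = predictions_url("http://localhost", 8080, "my_model")
print(url)  # http://localhost:8080/predictions/my_model

# A raw inference request would look like this (requires a running TorchServe):
# req = urllib.request.Request(
#     url,
#     data=json.dumps({"data": [1, 2, 3]}).encode(),
#     headers={"Content-Type": "application/json"},
# )
# response = urllib.request.urlopen(req).read()
```

The client saves you from building these URLs and requests by hand for every management and inference call.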

gRPC Client

To create a gRPC client, initialize a TorchServeClientGRPC object:

from torchserve_client import TorchServeClientGRPC

# Initialize the gRPC TorchServeClient object
ts_client = TorchServeClientGRPC()
ts_client
TorchServeClientGRPC(base_url=localhost, management_port=7071, inference_port=7070)

To customize the base URL and default ports, pass them as arguments during initialization:

from torchserve_client import TorchServeClientGRPC

# Customize the base URL, management port, and inference port
ts_client = TorchServeClientGRPC(base_url='http://your-torchserve-server.com',
                                 management_port=7071, inference_port=7070)
ts_client
TorchServeClientGRPC(base_url=your-torchserve-server.com, management_port=7071, inference_port=7070)
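Note that the repr above shows the base URL without its http:// scheme: gRPC dials bare host:port targets (7070 for inference and 7071 for management, matching TorchServe's gRPC defaults), so any scheme must be stripped. A minimal sketch of that target construction, assuming this behavior; the helper name is illustrative, not the library's API:

```python
from urllib.parse import urlparse

def grpc_target(base_url: str, port: int) -> str:
    # gRPC channels expect a bare "host:port" target, so drop any http:// scheme
    host = urlparse(base_url).netloc or base_url
    return f"{host}:{port}"

print(grpc_target("http://your-torchserve-server.com", 7070))
# your-torchserve-server.com:7070
print(grpc_target("localhost", 7071))
# localhost:7071
```

A real gRPC call would then open a channel on that target (e.g. with grpcio's grpc.insecure_channel) and use stubs generated from TorchServe's proto definitions.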

With these intuitive APIs at your disposal, you can harness the full power of the Management and Inference APIs and take your application to the next level. Happy inferencing! 🚀🔥

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

torchserve_client-0.0.2.tar.gz (14.8 kB)

Uploaded Source

Built Distribution

torchserve_client-0.0.2-py3-none-any.whl (16.3 kB)

Uploaded Python 3

File details

Details for the file torchserve_client-0.0.2.tar.gz.

File metadata

  • Download URL: torchserve_client-0.0.2.tar.gz
  • Upload date:
  • Size: 14.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.8.17

File hashes

Hashes for torchserve_client-0.0.2.tar.gz
Algorithm Hash digest
SHA256 51de7a93884594f7e1fc88cadc3e55bda5f6e7720031437b5453d6c22e7c28ad
MD5 4b288b2159f6f99511e9ae7d81e2ad1f
BLAKE2b-256 d7e70f0faa551a101953d946ab2e0f0ee1082936849063e72020ba8f60677fab

See more details on using hashes here.
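To check a downloaded file against the published digests, you can hash it locally and compare. A minimal sketch using only the standard library; the file path assumes the archive sits in the current directory:

```python
import hashlib

def sha256_of(path: str) -> str:
    # Stream the file in chunks so large archives don't load into memory at once
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

expected = "51de7a93884594f7e1fc88cadc3e55bda5f6e7720031437b5453d6c22e7c28ad"
# Uncomment after downloading the sdist to the current directory:
# assert sha256_of("torchserve_client-0.0.2.tar.gz") == expected
```

The same check applies to the wheel with its own SHA256 digest listed below.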

File details

Details for the file torchserve_client-0.0.2-py3-none-any.whl.

File hashes

Hashes for torchserve_client-0.0.2-py3-none-any.whl
Algorithm Hash digest
SHA256 682446845ba743a678bfaefa5015f79d22e9fc266db3291a64d91e79b6f25bbc
MD5 e0c6f6aaaa26e564343d4bd3e3e2876d
BLAKE2b-256 3f7f6b7e19f40c7d607511e31cbd7f3bf7750c8b5b451b7f06dd9519e30d2a7f

See more details on using hashes here.
