# AxonServe

A simple package for ML model serving with gRPC.
## Installation

```shell
$ pip install axon_serve
```
## Usage

### Server
```python
from axon_serve import PredictionService, GRPCService


# implement the PredictionService class
class TestPredictionService(PredictionService):
    def __init__(self):
        super().__init__()

    # override the predict method with your custom prediction logic;
    # model_input and the return value must be NumPy arrays,
    # params is a dict of optional kwargs for the model
    def predict(self, model_input, params):
        print("model_input: ", model_input.shape)
        print("params: ", params)
        return model_input


if __name__ == "__main__":
    # instantiate your prediction service
    test_prediction_service = TestPredictionService()

    # register it with GRPCService, providing the port
    service = GRPCService(test_prediction_service, port=5005)

    # start the server
    service.start()
```
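To illustrate the predict contract (NumPy arrays in and out, with optional kwargs in `params`), here is a hypothetical standalone implementation that uses only NumPy; the `scale` parameter is an invented example, not part of axon_serve:

```python
import numpy as np


def predict(model_input, params):
    # hypothetical logic: scale the input by an optional "scale" param,
    # defaulting to 1.0 (identity) when the key is absent
    scale = params.get("scale", 1.0)
    return model_input * scale


result = predict(np.array([1.0, 2.0, 3.0]), {"scale": 2.0})
# → array([2., 4., 6.])
```

The same function dropped into `TestPredictionService.predict` above would let clients tune behavior per request without changing the server.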
### Client
```python
import numpy as np

from axon_serve import ModelServeClient

# instantiate the client, providing host and port
serve_client = ModelServeClient(host="localhost", port=5005)

model_input = np.array([1, 2, 3])
params = {"test_param": True}

# call the predict method to send a request to the server
result = serve_client.predict(model_input, params)
print(result.shape)
```
## TODOs

- add tests
- add support for secure channels
- add support for arbitrary tensor serialization
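On the last TODO item: a common way to serialize arbitrary NumPy tensors for transport is to send the raw bytes together with the dtype and shape needed to reconstruct the array. A minimal sketch (the function names are illustrative, not part of axon_serve):

```python
import numpy as np


def serialize_tensor(arr):
    # pack raw bytes plus the metadata needed to rebuild the array
    return {"data": arr.tobytes(), "dtype": str(arr.dtype), "shape": arr.shape}


def deserialize_tensor(msg):
    # rebuild the flat array from bytes, then restore the original shape
    return np.frombuffer(msg["data"], dtype=msg["dtype"]).reshape(msg["shape"])


original = np.arange(6, dtype=np.float32).reshape(2, 3)
restored = deserialize_tensor(serialize_tensor(original))
```

In a real gRPC service the `data`, `dtype`, and `shape` fields would map naturally onto message fields in the `.proto` definition.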