Model Server for Apache MXNet (MMS) is a tool for serving neural network models for inference.
Use the MMS Server CLI, or the pre-configured Docker images, to start a service that sets up HTTP endpoints to handle model inference requests.
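Once a model is being served, clients query it over plain HTTP. As a minimal sketch, the snippet below builds the prediction URL for a served model; the host, port 8080, and the model name "squeezenet" are illustrative assumptions (they depend on how you started the server), and the commented-out POST shows the shape of an actual request.

```python
# Sketch: addressing a running MMS HTTP endpoint.
# Assumes MMS was started with a model named "squeezenet" on the
# default port 8080 -- both are assumptions, not fixed values.
import urllib.request


def predict_url(host, port, model_name):
    """Build the prediction endpoint URL for a served model."""
    return "http://{}:{}/{}/predict".format(host, port, model_name)


# Sending image bytes would look like this (not executed here,
# since it requires a running server):
# req = urllib.request.Request(
#     predict_url("127.0.0.1", 8080, "squeezenet"),
#     data=image_bytes, method="POST")
# resp = urllib.request.urlopen(req)

print(predict_url("127.0.0.1", 8080, "squeezenet"))
```

Consult the MMS docs for the exact endpoint layout and request format your MMS version exposes.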
Detailed documentation and examples are provided in the docs folder.
If you wish to use ONNX models with MMS, you will first need to install a protobuf compiler. This step is not needed for serving MXNet models.
To install the latest released version of MMS:

pip install mxnet-model-server
We welcome new contributors of all experience levels. For information on how to install MMS for development, refer to the MMS docs.
You can check the latest source code as follows:
git clone https://github.com/awslabs/mxnet-model-server.git
If you use MMS in a publication or project, please cite MMS: https://github.com/awslabs/mxnet-model-server
|Filename (size)||File type||Python version||Upload date|
|mxnet_model_server-0.4-py2.py3-none-any.whl (13.4 MB)||Wheel||py2.py3||May 25, 2018|