Model Server for Apache MXNet is a tool for serving neural network models for inference
Project description
Apache MXNet Model Server (MMS) is a flexible, easy-to-use tool for serving deep learning models exported from MXNet or the Open Neural Network Exchange (ONNX).
Use the MMS Server CLI, or the pre-configured Docker images, to start a service that sets up HTTP endpoints to handle model inference requests.
Detailed documentation and examples are provided in the docs folder.
Prerequisites
If you wish to serve ONNX models with MMS, you must first install a protobuf compiler. This is not needed if you only serve MXNet models.
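As a rough sketch, the protobuf compiler can typically be installed through your system package manager or conda; the exact package names below are assumptions that may vary by platform and distribution, so check the MMS docs for your setup.

```shell
# On Ubuntu/Debian (package names may vary by release):
sudo apt-get install protobuf-compiler libprotoc-dev

# On macOS, or in a conda environment:
conda install -c conda-forge protobuf
```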
Installation
pip install mxnet-model-server
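After installing, you can start a serving endpoint from the CLI. The sketch below follows the MMS 1.0 quickstart pattern; the model archive URL, flags, and default port 8080 are assumptions that may differ across MMS versions, so consult the docs folder for your release.

```shell
# Start MMS, loading a SqueezeNet model archive (URL from the MMS
# model zoo examples; substitute your own .mar file as needed).
mxnet-model-server --start \
    --models squeezenet=https://s3.amazonaws.com/model-server/model_archive_1.0/squeezenet_v1.1.mar

# Send an image to the HTTP inference endpoint (default port 8080).
curl -O https://s3.amazonaws.com/model-server/inputs/kitten.jpg
curl -X POST http://127.0.0.1:8080/predictions/squeezenet -T kitten.jpg

# Stop the server when finished.
mxnet-model-server --stop
```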
Development
We welcome new contributors of all experience levels. For information on how to install MMS for development, refer to the MMS docs.
Important links
Source code
You can check out the latest source code as follows:
git clone https://github.com/awslabs/mxnet-model-server.git
Testing
After installation, try out the MMS Quickstart for Serving a Model and Exporting a Model.
Help and Support
Citation
If you use MMS in a publication or project, please cite MMS: https://github.com/awslabs/mxnet-model-server
Hashes for mxnet_model_server-1.0b20181009-py2.py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | ef873479eb42294b4768a50031ecb35819cb581d9edf2bbd6983fd418662a004
MD5 | 29085e413f28898fd282e5f14b1c9c18
BLAKE2b-256 | 83a10ab27e48a01417c6ea6985dc4b433296f74844ed30e0464abe433270ac09