
Multi Model Server is a tool for serving neural net models for inference

Project description

Multi Model Server (MMS) is a flexible and easy-to-use tool for serving deep learning models exported from MXNet or the Open Neural Network Exchange (ONNX).

Use the MMS Server CLI, or the pre-configured Docker images, to start a service that sets up HTTP endpoints to handle model inference requests.
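Once the server is up, each loaded model is exposed over HTTP. As a minimal client sketch (assuming the default inference address of http://127.0.0.1:8080 and a model registered under the hypothetical name squeezenet), a request to the predictions endpoint might look like:

```python
import urllib.request


def prediction_url(model_name, host="http://127.0.0.1:8080"):
    """Build the URL of the /predictions/<model> inference endpoint."""
    return "{}/predictions/{}".format(host, model_name)


def predict(model_name, payload, content_type="application/octet-stream"):
    """POST raw bytes (e.g. an encoded image) to a model and return the response body."""
    req = urllib.request.Request(
        prediction_url(model_name),
        data=payload,
        headers={"Content-Type": content_type},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()
```

For example, `predict("squeezenet", open("kitten.jpg", "rb").read())` would send an image for classification, assuming a server is running with that model loaded.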

Detailed documentation and examples are provided in the docs folder.

Prerequisites

  • java 8: Required. MMS uses Java to serve HTTP requests. You must install Java 8 (or later) and make sure the java executable is available on your $PATH before installing MMS. If you have multiple Java versions installed, you can use the $JAVA_HOME environment variable to control which one MMS uses.

  • mxnet: As of MMS 1.0, MXNet is no longer installed by default. You must install it manually if your models use MXNet.

For Ubuntu:

sudo apt-get install openjdk-8-jre-headless

For CentOS:

sudo yum install java-1.8.0-openjdk

For Mac:

brew tap caskroom/versions
brew update
brew cask install java8
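After installing Java, you can confirm that a suitable java executable is visible, honoring $JAVA_HOME when it is set, with a short check (the helper names here are ours for illustration, not part of MMS):

```python
import os
import shutil
import subprocess


def find_java():
    """Return the path of the java executable MMS would pick up, or None.

    Prefers $JAVA_HOME/bin/java when $JAVA_HOME is set, otherwise falls
    back to whatever `java` appears first on $PATH.
    """
    java_home = os.environ.get("JAVA_HOME")
    if java_home:
        candidate = os.path.join(java_home, "bin", "java")
        if os.path.isfile(candidate) and os.access(candidate, os.X_OK):
            return candidate
    return shutil.which("java")


def java_version_line(java_path):
    """Return the first line of `java -version` output (printed on stderr)."""
    out = subprocess.run([java_path, "-version"],
                         capture_output=True, text=True, check=True)
    return out.stderr.splitlines()[0]
```

If `find_java()` returns None, install Java 8 (or set $JAVA_HOME) before proceeding.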

Install MXNet:

pip install mxnet

MXNet offers MKL pip packages that are much faster on Intel hardware. To install the MKL package for CPU:

pip install mxnet-mkl

or for a GPU instance:

pip install mxnet-cu92mkl
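Because MMS no longer pulls MXNet in for you, a quick guard before serving MXNet models can give a clearer error than a deep import failure. This check is our own convention, not part of MMS:

```python
def require_mxnet():
    """Return the installed MXNet version string, or raise a helpful error."""
    try:
        import mxnet
    except ImportError as exc:
        raise RuntimeError(
            "MXNet is not installed; run `pip install mxnet` "
            "(or mxnet-mkl / mxnet-cu92mkl) before serving MXNet models."
        ) from exc
    return mxnet.__version__
```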

Installation

pip install multi-model-server

Development

We welcome new contributors of all experience levels. For information on how to install MMS for development, refer to the MMS docs.

Source code

You can check the latest source code as follows:

git clone https://github.com/awslabs/multi-model-server.git

Testing

After installation, try out the MMS Quickstart.

Help and Support

Citation

If you use MMS in a publication or project, please cite MMS: https://github.com/awslabs/multi-model-server

Project details


Release history

Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release.

Built Distribution


multi_model_server-1.1.0b20191213-py2.py3-none-any.whl (4.5 MB)

File details

Details for the file multi_model_server-1.1.0b20191213-py2.py3-none-any.whl.

File metadata

  • Download URL: multi_model_server-1.1.0b20191213-py2.py3-none-any.whl
  • Upload date:
  • Size: 4.5 MB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.15.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/39.2.0 requests-toolbelt/0.9.1 tqdm/4.40.2 CPython/2.7.12

File hashes

Hashes for multi_model_server-1.1.0b20191213-py2.py3-none-any.whl

  • SHA256: 7f24e941fac6520b6bd0c3db883d5fea6327a47e8d3eabc33096b4fae6f2f24a
  • MD5: 830fdc0be3e331636f769513e41df9e3
  • BLAKE2b-256: 52f987fe3415f7ec8143a8ebef0dcf188b13bb8b4eddfc8d8eebdb1c1ae10fb5

