
Machine learning prediction serving


ServeIt lets you serve model predictions and supplementary information from a RESTful API. Current features include:

  1. Model inference serving via RESTful API endpoint

  2. Extensible library for inference-time data loading, preprocessing, input validation, and postprocessing

  3. Supplementary information endpoint creation

  4. Automatic JSON serialization of responses

  5. Configurable request and response logging (work in progress)
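To make the request/response cycle concrete, here is a minimal sketch of the JSON serialization described in features 2 and 4. The payload shape (a list of feature rows) and the stand-in prediction rule are illustrative assumptions, not ServeIt internals:

```python
import json

# Hypothetical request body: one row of iris-style features per prediction
request_body = json.dumps([[5.1, 3.5, 1.4, 0.2],
                           [6.2, 2.9, 4.3, 1.3]])

# The server deserializes the payload, calls the prediction method,
# and serializes the result back to JSON
payload = json.loads(request_body)
# Stand-in for clf.predict: threshold on petal length (illustrative only)
predictions = [0 if row[2] < 2.5 else 1 for row in payload]
response_body = json.dumps(predictions)
print(response_body)  # → [0, 1]
```

In a real deployment the deserialization, prediction call, and response serialization are handled by the server for you.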

Installation

ServeIt supports Python 2.7 and 3.6. Installation is easy with pip:

pip install serveit

Usage

Deploy your model clf to an API endpoint with as little as one line of code:

from serveit.server import ModelServer

# 'clf' is any fitted model with a predict method
# initialize the server with the model and its prediction method,
# then start serving predictions
ModelServer(clf, clf.predict).serve()

Your new API is now accepting POST requests at localhost:5000/predictions! Please see the examples directory for additional usage.
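A client can then POST feature data to the endpoint. The sketch below builds (but does not send) such a request with the standard library; the URL follows the localhost:5000/predictions address above, while the JSON payload shape and feature values are illustrative assumptions:

```python
import json
from urllib import request

# Assumed payload shape: a JSON list of feature rows
body = json.dumps([[5.1, 3.5, 1.4, 0.2]]).encode("utf-8")

# Build the POST request against the endpoint started above
req = request.Request(
    "http://localhost:5000/predictions",
    data=body,
    headers={"Content-Type": "application/json"},
    method="POST",
)
print(req.method, req.full_url)  # → POST http://localhost:5000/predictions
```

With the server running, `request.urlopen(req)` would return the JSON-serialized predictions.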

Supported libraries

  • Scikit-Learn

  • Keras

Coming soon:

  • TensorFlow

  • PyTorch

Building

You can build and install from source with: python setup.py install

License

MIT

Download files

Source Distribution: ServeIt-0.0.3b4.tar.gz (7.7 kB)

Built Distribution: ServeIt-0.0.3b4-py2.py3-none-any.whl (15.8 kB, Python 2 and 3)
