BentoML: Package and Deploy Your Machine Learning Models
From a model in an IPython notebook to a production API service in 5 minutes.
BentoML is a Python library for packaging and deploying machine learning models. It provides high-level APIs for defining an ML service and bundling its artifacts, source code, dependencies, and configurations into a production-system-friendly format that is ready for deployment.
- Feature Highlights
- Getting Started
- More About BentoML
- Releases and Contributing
Multiple Distribution Formats - Easily bundle your machine learning models into the format that works best with your inference scenario:
- Docker Image - includes a built-in REST API server
- PyPI Package - integrates seamlessly with your Python applications
- CLI tool - put your model into an Airflow DAG or CI/CD pipeline
- Spark UDF - run batch serving on large datasets with Spark
- Serverless Function - host your model on serverless cloud platforms
Multiple Framework Support - BentoML supports a wide range of ML frameworks out of the box, including TensorFlow, PyTorch, Scikit-Learn, and XGBoost, and can be easily extended to work with new or custom frameworks.
Deploy Anywhere - BentoML-bundled ML services can be easily deployed with platforms such as Docker, Kubernetes, Serverless, Airflow, and Clipper, on cloud platforms including AWS Lambda/ECS/SageMaker, Google Cloud Functions, and Azure ML.
Custom Runtime Backend - Easily integrate your Python preprocessing code with a high-performance deep learning model runtime backend (such as tensorflow-serving) to deploy low-latency serving endpoints.
```bash
pip install bentoml
```
BentoML does not change your training workflow. Let's train a simple scikit-learn model as an example:
```python
from sklearn import svm
from sklearn import datasets

clf = svm.SVC(gamma='scale')
iris = datasets.load_iris()
X, y = iris.data, iris.target
clf.fit(X, y)
```
To package this model with BentoML, create a new BentoService subclass and provide artifact and environment definitions for it:
```python
%%writefile iris_classifier.py
from bentoml import BentoService, api, env, artifacts
from bentoml.artifact import PickleArtifact
from bentoml.handlers import DataframeHandler

@artifacts([PickleArtifact('model')])
@env(conda_dependencies=["scikit-learn"])
class IrisClassifier(BentoService):

    @api(DataframeHandler)
    def predict(self, df):
        return self.artifacts.model.predict(df)
```
Now, to save your trained model for production use, simply import your BentoService class and pack it with the required artifacts:
```python
from iris_classifier import IrisClassifier

svc = IrisClassifier.pack(model=clf)
svc.save('./saved_bento', version='v0.0.1')
# Saving archive to ./saved_bento/IrisClassifier/v0.0.1/
```
That's it. You have now created your first BentoArchive: a directory containing all the source code, data, and configuration files required to run this model in production. There are a few ways you could use this archive:
Serving a BentoArchive via REST API
To expose your model as an HTTP API endpoint, simply use the bentoml serve command:
```bash
bentoml serve ./saved_bento/IrisClassifier/v0.0.1/
```
Note: you must ensure the pip and conda dependencies are available in your Python environment when using the bentoml serve command. More commonly, we recommend using the BentoML API server with Docker (see below).
Build API server Docker Image from BentoArchive
You can build a Docker image for an API server hosting your BentoML archive by using the archive folder as the docker build context:
```bash
cd ./saved_bento/IrisClassifier/v0.0.1/
docker build -t myorg/iris-classifier .
```
Next, you can docker push the image to your registry of choice for deployment, or run it locally for development and testing:
```bash
docker run -p 5000:5000 myorg/iris-classifier
```
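With the container running, you can send prediction requests over HTTP. The sketch below uses only the standard library; note that the endpoint path, port, and two-dimensional JSON payload shape are assumptions inferred from the `predict` api defined earlier, not documented guarantees of the API server.

```python
import json
from urllib import request

# One iris sample per row: [sepal length, sepal width, petal length, petal width].
# The DataFrame-like nested-list JSON layout is an assumption for illustration.
payload = json.dumps([[5.1, 3.5, 1.4, 0.2]]).encode()

# The endpoint path mirrors the `predict` api defined on the service above;
# sending this request requires the container from `docker run` to be up on port 5000:
req = request.Request(
    "http://localhost:5000/predict",
    data=payload,
    headers={"Content-Type": "application/json"},
)
# response = request.urlopen(req)
# print(response.read().decode())
```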
Loading BentoArchive in Python
```python
import bentoml

bento_svc = bentoml.load('./saved_bento/IrisClassifier/v0.0.1/')
bento_svc.predict(X)
```
BentoML also supports loading an archive from an S3 location directly:
```python
bento_svc = bentoml.load('s3://my-bento-svc/iris_classifier/')
```
Install BentoArchive as PyPI package
First, install your exported BentoService with pip:

```bash
pip install ./saved_bento/IrisClassifier/v0.0.1/
```
Now you can import it and use it as a Python module:
```python
import IrisClassifier

installed_svc = IrisClassifier.load()
installed_svc.predict(X)
```
Note that you could also publish your exported BentoService as a public Python package on pypi.org, or upload it to your organization's private PyPI index:
```bash
cd ./saved_bento/IrisClassifier/v0.0.1/
python setup.py sdist upload
```
Loading BentoArchive from CLI
pip installing a BentoML archive also gives you a CLI tool for accessing your BentoService's APIs from the command line:
```bash
pip install ./saved_bento/IrisClassifier/v0.0.1/
IrisClassifier info
IrisClassifier predict --input='./test.csv'
```
Alternatively, you can use the bentoml CLI to load and run a BentoArchive directly:
```bash
bentoml info ./saved_bento/IrisClassifier/v0.0.1/
bentoml predict ./saved_bento/IrisClassifier/v0.0.1/ --input='./test.csv'
```
CLI access makes it easy to put your saved BentoArchive into an Airflow DAG, integrate your packaged ML model into your testing environment, or use it in combination with other shell tools.
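For instance, a scheduler task or test script can shell out to the installed CLI. The helper below is a hypothetical sketch: it assumes the IrisClassifier package from the previous section has been pip-installed, and simply builds and runs the same `IrisClassifier predict` command shown above.

```python
import shutil
import subprocess

def build_predict_command(input_csv):
    """Build the CLI invocation for the pip-installed IrisClassifier archive
    (the command name matches the BentoService class defined earlier)."""
    return ["IrisClassifier", "predict", f"--input={input_csv}"]

def run_prediction(input_csv):
    """Run the prediction CLI and return its stdout, or None when the
    archive is not installed in the current environment."""
    cmd = build_predict_command(input_csv)
    if shutil.which(cmd[0]) is None:
        return None  # archive not pip-installed here
    result = subprocess.run(cmd, check=True, capture_output=True, text=True)
    return result.stdout
```

The same pattern works from an Airflow BashOperator or any CI step, since the prediction is just a shell command once the archive is installed.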
All examples can be found in the BentoML/examples directory.
- Quick Start with sklearn
- Sentiment Analysis with Scikit-Learn
- Text Classification with TensorFlow Keras
- Fashion MNIST classification with PyTorch
- Fashion MNIST classification with TensorFlow Keras
- More examples coming soon!
More About BentoML
We built BentoML because we think there should be a much simpler way for machine learning teams to ship models to production. They should not have to wait for engineering teams to re-implement their models for the production environment, or build complex feature pipelines for experimental models.
Our vision is to empower machine learning scientists to build and ship their own models end-to-end as production services, just like software engineers do. BentoML is essentially this missing 'build tool' for machine learning projects.
Releases and Contributing
BentoML is under active development. The current version is a beta release; we may change APIs in future releases.
Want to help build BentoML? Check out our contributing documentation.
BentoML is GPL-3.0 licensed, as found in the COPYING file.