A platform for serving and deploying machine learning models in the cloud
From ML model to production API endpoint with a few lines of code
Getting Started | Documentation | Gallery | Contributing | Releases | License | Blog
BentoML makes it easy to serve and deploy machine learning models in the cloud.
It is an open source framework for machine learning teams to build cloud-native prediction API services that are ready for production. BentoML supports most popular ML training frameworks and common deployment platforms including major cloud providers and docker/kubernetes.
👉 Join BentoML Slack community to hear about the latest development updates.
Getting Started
Installation with pip:
pip install bentoml
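To verify the installation, import the package from Python and print its version (a quick sanity check; it assumes the release exposes the usual __version__ attribute, as BentoML releases do):

import bentoml

# Confirm BentoML is importable and report the installed version
print(bentoml.__version__)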
Defining a prediction service with BentoML:
import bentoml
from bentoml.handlers import DataframeHandler
from bentoml.artifact import SklearnModelArtifact

@bentoml.env(pip_dependencies=["scikit-learn"])  # defining pip/conda dependencies to be packed
@bentoml.artifacts([SklearnModelArtifact('model')])  # defining required artifacts, typically trained models
class IrisClassifier(bentoml.BentoService):

    @bentoml.api(DataframeHandler)  # defining prediction service endpoint and expected input format
    def predict(self, df):
        # Pre-processing logic and access to trained model artifacts in the API function
        return self.artifacts.model.predict(df)
Train a classifier model with the default Iris dataset and pack the trained model with the BentoService IrisClassifier defined above:
from sklearn import svm
from sklearn import datasets

if __name__ == "__main__":
    clf = svm.SVC(gamma='scale')
    iris = datasets.load_iris()
    X, y = iris.data, iris.target
    clf.fit(X, y)

    # Create an iris classifier service
    iris_classifier_service = IrisClassifier()

    # Pack it with the newly trained model artifact
    iris_classifier_service.pack('model', clf)

    # Save the prediction service to a BentoService bundle
    saved_path = iris_classifier_service.save()
A BentoService bundle is a versioned file archive, containing the BentoService you defined, along with trained model artifacts, dependencies and configurations.
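You can also load a saved bundle back into Python to smoke-test it before serving. A minimal sketch, using the saved_path returned by save() above and BentoML's bentoml.load API for loading saved bundles:

import bentoml

# Load the saved BentoService bundle back into a Python object
svc = bentoml.load(saved_path)

# Call the prediction API defined on IrisClassifier
print(svc.predict([[5.1, 3.5, 1.4, 0.2]]))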
Now you can start a REST API server based on the saved BentoService bundle from the command line:
bentoml serve {saved_path}
If you are running this on your local machine, visit http://127.0.0.1:5000
in your browser to play around with the API server's Web UI for debugging and
sending test requests. You can also send a prediction request with curl
from the command line:
curl -i \
--header "Content-Type: application/json" \
--request POST \
--data '[[5.1, 3.5, 1.4, 0.2]]' \
http://localhost:5000/predict
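The same request can also be sent from Python with the requests library (a minimal sketch; it assumes the API server started by bentoml serve is still running on localhost:5000):

import requests

# POST one sample to the /predict endpoint, mirroring the curl example above
response = requests.post(
    'http://localhost:5000/predict',
    json=[[5.1, 3.5, 1.4, 0.2]],
)
print(response.status_code, response.text)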
The saved BentoService bundle is also structured to work as a docker build context, which can be used to build a docker image for deployment:
docker build -t my_api_server {saved_path}
The saved BentoService bundle can also be loaded directly from the command line:
bentoml predict {saved_path} --input='[[5.1, 3.5, 1.4, 0.2]]'
# alternatively:
bentoml predict {saved_path} --input='./iris_test_data.csv'
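The same batch prediction can be scripted in Python by loading the bundle and passing a pandas DataFrame (a sketch, assuming iris_test_data.csv contains the four feature columns the model expects):

import pandas as pd
import bentoml

# Run the prediction API over every row of a CSV file
svc = bentoml.load(saved_path)
print(svc.predict(pd.read_csv('./iris_test_data.csv')))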
The saved bundle is pip-installable and can be directly distributed as a PyPI package:
pip install {saved_path}
# Your BentoService class name will become the package name
import IrisClassifier
installed_svc = IrisClassifier.load()
installed_svc.predict([[5.1, 3.5, 1.4, 0.2]])
Deploy the saved BentoService to cloud services such as AWS Lambda with the bentoml
command:
bentoml deployment create my-iris-classifier --bento IrisClassifier:{VERSION} --platform=aws-lambda
To learn more, try out our 5-minute Quick Start notebook, which uses BentoML to turn a trained sklearn model into a containerized REST API server and then deploys it to AWS Lambda: Download, Google Colab, nbviewer
Examples
FastAI
- Pet Image Classification - Google Colab | nbviewer | source
- Salary Range Prediction - Google Colab | nbviewer | source
Scikit-Learn
- Sentiment Analysis - Google Colab | nbviewer | source
PyTorch
- Fashion MNIST - Google Colab | nbviewer | source
- CIFAR-10 Image Classification - Google Colab | nbviewer | source
Keras
- Fashion MNIST - Google Colab | nbviewer | source
- Text Classification - Google Colab | nbviewer | source
- Toxic Comment Classifier - Google Colab | nbviewer | source
XGBoost
- Titanic Survival Prediction - Google Colab | nbviewer | source
- League of Legends Win Prediction - Google Colab | nbviewer | source
LightGBM
- Titanic Survival Prediction - Google Colab | nbviewer | source
H2O
- Loan Default Prediction - Google Colab | nbviewer | source
- Prostate Cancer Prediction - Google Colab | nbviewer | source
Visit bentoml/gallery repository for more example projects demonstrating how to use BentoML.
Deployment guides:
- BentoML AWS Lambda Deployment Guide
- BentoML AWS SageMaker Deployment Guide
- BentoML Clipper.ai Deployment Guide
- BentoML AWS ECS Deployment Guide
- BentoML Google Cloud Run Deployment Guide
- (Beta) BentoML Kubernetes Deployment Guide
Feature Highlights
- Multiple Distribution Formats - Easily package your machine learning models and preprocessing code into a format that works best with your inference scenario:
  - Docker Image - deploy as containers running a REST API server
  - PyPI Package - integrate into your Python applications seamlessly
  - CLI tool - put your model into an Airflow DAG or CI/CD pipeline
  - Spark UDF - run batch serving on a large dataset with Spark
  - Serverless Function - host your model on serverless platforms such as AWS Lambda
- Multiple Framework Support - BentoML supports a wide range of ML frameworks out of the box, including TensorFlow, PyTorch, Keras, Scikit-Learn, XGBoost, H2O, and FastAI, and can be easily extended to work with new or custom frameworks
- Deploy Anywhere - BentoService bundles can be easily deployed with platforms such as Docker, Kubernetes, Serverless, Airflow, and Clipper, on cloud platforms including AWS, Google Cloud, and Azure
- Custom Runtime Backend - Easily integrate your Python pre-processing code with high-performance deep learning runtime backends such as tensorflow-serving
- Workflow Designed For Teams - The YataiService component in BentoML provides a Web UI and APIs for managing and deploying all the models and prediction services your team has created or deployed, in a centralized service
Documentation
Full documentation and API references can be found at bentoml.readthedocs.io
Usage Tracking
The BentoML library reports basic usage statistics to Amplitude by default. This helps the BentoML authors understand how people are using the tool and improve it over time. You can easily opt out by running the following command from the terminal:
bentoml config set usage_tracking=false
Contributing
Have questions or feedback? Post a new GitHub issue or discuss in our Slack channel.
Want to help build BentoML? Check out our contributing guide and the development guide.
Releases
BentoML is under active development and evolving rapidly. It is currently a beta release; APIs may change in future releases.
Read more about the latest features and changes in BentoML from the releases page.
License
Apache License 2.0