
A python framework for serving and operating machine learning models

Project description


From a model in a Jupyter notebook to a production API service in 5 minutes


Getting Started | Documentation | Gallery | Contributing | Releases | License | Blog

BentoML is a flexible framework that accelerates the workflow of serving and deploying machine learning models in the cloud.

Check out our 5-minute Quickstart Notebook, which uses BentoML to turn a trained sklearn model into a containerized REST API server and then deploys it to AWS Lambda.

If you are using BentoML for production workloads or want to contribute, be sure to join our Slack channel to hear the latest development updates.

Getting Started

Installation with pip:

pip install bentoml

Defining a prediction service with BentoML:

import bentoml
from bentoml.handlers import DataframeHandler
from bentoml.artifact import SklearnModelArtifact

@bentoml.env(pip_dependencies=["scikit-learn"])
@bentoml.artifacts([SklearnModelArtifact('model')])
class IrisClassifier(bentoml.BentoService):

    @bentoml.api(DataframeHandler)
    def predict(self, df):
        return self.artifacts.model.predict(df)

Train a classifier on the default Iris dataset and pack the trained model with the IrisClassifier service defined above:

from sklearn import svm
from sklearn import datasets

clf = svm.SVC(gamma='scale')
iris = datasets.load_iris()
X, y =,, y)

# Create an iris classifier service with the newly trained model
iris_classifier_service = IrisClassifier.pack(model=clf)

# Save the entire prediction service to a file bundle
saved_path =

A BentoML bundle is a versioned file archive, containing the BentoService you defined, along with trained model artifacts, dependencies and configurations.

Now you can start a REST API server from the command line, based on the saved BentoML bundle:

bentoml serve {saved_path}

If you are running this on your local machine, visit http://localhost:5000 in your browser to play around with the API server's web UI for debugging and testing. You can also send a prediction request with curl from the command line:

curl -i \
  --header "Content-Type: application/json" \
  --request POST \
  --data '[[5.1, 3.5, 1.4, 0.2]]' \
  http://localhost:5000/predict

The saved BentoML bundle can also be used directly from the command line for inference:

bentoml predict {saved_path} --input='[[5.1, 3.5, 1.4, 0.2]]'

# alternatively:
bentoml predict {saved_path} --input='./iris_test_data.csv'
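For the CSV variant, the input file holds one sample per row with the four iris feature columns. A minimal sketch of generating such a file with the standard library (the filename matches the command above; whether a header row is required depends on your handler configuration):

```python
import csv

# Two iris feature rows: sepal length, sepal width, petal length, petal width
rows = [
    [5.1, 3.5, 1.4, 0.2],
    [6.2, 2.8, 4.8, 1.8],
]

with open("iris_test_data.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)
```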

A BentoML bundle is pip-installable and can be distributed directly as a PyPI package:

pip install {saved_path}
# Your BentoService class name becomes the package name
import IrisClassifier

installed_svc = IrisClassifier.load()
installed_svc.predict([[5.1, 3.5, 1.4, 0.2]])

A BentoML bundle is structured to work as a Docker build context, so you can easily build a Docker image for the API server by pointing docker build at the bundle directory:

docker build -t my_api_server {saved_path}

To learn more, try out the Getting Started with BentoML notebook on Google Colab.








Visit bentoml/gallery repository for more example projects demonstrating how to use BentoML.

Deployment guides:

Project Overview

BentoML provides two sets of high-level APIs:

  • BentoService: Turn your trained ML model into a versioned file bundle that can be deployed as a containerized REST API server, a PyPI package, a CLI tool, or a batch/streaming job

  • YataiService: Manage and deploy your saved BentoML bundles as prediction services on a Kubernetes cluster or on cloud platforms such as AWS Lambda, SageMaker, Azure ML, and Google Cloud Functions

Feature Highlights

  • Multiple Distribution Format - Easily package your Machine Learning models and preprocessing code into a format that works best with your inference scenario:

    • Docker Image - deploy as containers running REST API Server
    • PyPI Package - integrate into your python applications seamlessly
    • CLI tool - put your model into Airflow DAG or CI/CD pipeline
    • Spark UDF - run batch serving on a large dataset with Spark
    • Serverless Function - host your model on serverless platforms such as AWS Lambda
  • Multiple Framework Support - BentoML supports a wide range of ML frameworks out-of-the-box including Tensorflow, PyTorch, Keras, Scikit-Learn, xgboost, H2O, FastAI and can be easily extended to work with new or custom frameworks.

  • Deploy Anywhere - BentoML bundle can be easily deployed with platforms such as Docker, Kubernetes, Serverless, Airflow and Clipper, on cloud platforms including AWS, Google Cloud, and Azure.

  • Custom Runtime Backend - Easily integrate your Python pre-processing code with a high-performance deep learning runtime backend, such as tensorflow-serving.


Full documentation and API references are available on the BentoML documentation site.

Usage Tracking

By default, the BentoML library reports basic usage statistics via Amplitude. This helps the BentoML authors understand how people use the tool and improve it over time. You can opt out by running the following command in your terminal:

bentoml config set usage_tracking=false


Have questions or feedback? Post a new GitHub issue or join our Slack channel.

Want to help build BentoML? Check out our contributing guide and the development guide.


BentoML is under active development and evolving rapidly. It is currently a beta release, and we may change APIs in future versions.

Read more about the latest features and changes in BentoML from the releases page.


Apache License 2.0


Project details

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Files for BentoML, version 0.4.5:

  • BentoML-0.4.5-py3-none-any.whl (485.0 kB, Wheel, Python 3)
  • BentoML-0.4.5.tar.gz (436.1 kB, Source)
