An open framework for building, shipping and running machine learning services
From a model in a Jupyter notebook to a production API service in 5 minutes
BentoML is a Python framework for building, shipping and running machine learning services. It provides high-level APIs for defining an ML service and packaging its artifacts, source code, dependencies, and configurations into a production-system-friendly format that is ready for deployment.
Use BentoML if you need to:
Turn your ML model into a REST API server, serverless endpoint, PyPI package, or CLI tool
Manage the workflow of creating and deploying an ML service
pip install bentoml
Defining a machine learning service with BentoML is as simple as a few lines of code:
```python
from bentoml import api, artifacts, env, BentoService
from bentoml.artifact import PickleArtifact
from bentoml.handlers import DataframeHandler

@artifacts([PickleArtifact('model')])
@env(conda_pip_dependencies=["scikit-learn"])
class IrisClassifier(BentoService):

    @api(DataframeHandler)
    def predict(self, df):
        return self.artifacts.model.predict(df)
```
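As a rough sketch of what such a `predict` API does at request time, here is a simplified, dependency-free stand-in (the stub model and helper names are illustrative; the real `DataframeHandler` converts the request body into a pandas DataFrame before calling the model):

```python
import json

class StubModel:
    """Placeholder for a trained scikit-learn model."""
    def predict(self, rows):
        # Return the index of the largest feature per row (argmax),
        # mimicking a classifier's predicted class labels.
        return [max(range(len(r)), key=r.__getitem__) for r in rows]

def handle_predict(request_body, model):
    # DataframeHandler-style flow: decode JSON rows, run the model,
    # encode the predictions back to JSON for the HTTP response.
    rows = json.loads(request_body)
    return json.dumps(model.predict(rows))

body = json.dumps([[5.1, 3.5, 1.4, 0.2], [0.1, 0.2, 4.0, 1.0]])
print(handle_predict(body, StubModel()))  # → [0, 2]
```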
Multiple Distribution Formats - Easily package your machine learning models and preprocessing code into the format that works best with your inference scenario:
- Docker Image - deploy as containers running a REST API server
- PyPI Package - integrate into your Python applications seamlessly
- CLI tool - put your model into an Airflow DAG or CI/CD pipeline
- Spark UDF - run batch serving on a large dataset with Spark
- Serverless Function - host your model on serverless platforms such as AWS Lambda
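In the Docker/REST scenario above, a client can call the packaged API server using only the standard library. A minimal sketch follows; the port and `/predict` route are assumptions for illustration:

```python
import json
import urllib.request

# Build a prediction request against a locally running API server.
# The URL and port below are illustrative, not guaranteed defaults.
payload = json.dumps([[5.1, 3.5, 1.4, 0.2]]).encode("utf-8")
req = urllib.request.Request(
    "http://localhost:5000/predict",
    data=payload,
    headers={"Content-Type": "application/json"},
)
# With a server running, urllib.request.urlopen(req) would return the
# JSON-encoded predictions; here we only inspect the prepared request.
print(req.get_method(), req.full_url)  # → POST http://localhost:5000/predict
```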
Multiple Framework Support - BentoML supports a wide range of ML frameworks out of the box, including TensorFlow, PyTorch, Scikit-Learn, XGBoost, H2O, and FastAI, and can be easily extended to work with new or custom frameworks.
Deploy Anywhere - A BentoML-bundled ML service can be easily deployed with platforms such as Docker, Kubernetes, Serverless, Airflow and Clipper, on cloud platforms including AWS, Google Cloud, and Azure.
Custom Runtime Backend - Easily integrate your Python pre-processing code with a high-performance deep learning runtime backend, such as tensorflow-serving.
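The idea can be sketched with a stub backend standing in for a client to a remote runtime such as tensorflow-serving (all names below are illustrative, not BentoML APIs):

```python
def preprocess(texts):
    # Python pre-processing step that runs inside the service.
    return [t.strip().lower() for t in texts]

class StubBackend:
    """Stand-in for a high-performance runtime client (e.g. a
    tensorflow-serving gRPC client). Here it just counts tokens."""
    def infer(self, batch):
        return [len(x.split()) for x in batch]

def predict(texts, backend):
    # Pre-process in Python, then delegate inference to the backend.
    return backend.infer(preprocess(texts))

print(predict(["  Hello world ", "BentoML"], StubBackend()))  # → [2, 1]
```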
Full documentation and API references can be found at bentoml.readthedocs.io
All examples can be found under the BentoML/examples directory. More tutorials and examples coming soon!
- Quick Start Guide
- Scikit-learn Sentiment Analysis
- H2O Classification
- Keras Text Classification
- XGBoost Titanic Survival Prediction
- FastAI Pet Classification
- (WIP) PyTorch Fashion MNIST classification
- (WIP) Tensorflow Keras Fashion MNIST classification
- Serverless deployment with AWS Lambda
- API server deployment with AWS SageMaker
- (WIP) API server deployment on Kubernetes
- (WIP) API server deployment with Clipper
We collect example notebook page views to help us improve this project. To opt out of tracking, delete the `[Impression]` line in the first markdown cell of any example notebook.
Have questions or feedback? Post a new GitHub issue or join our Slack chat room:
To make sure you have a pleasant experience, please read the code of conduct. It outlines core values and beliefs and will make working together a happier experience.
BentoML is under active development and is evolving rapidly. It is currently a beta release; APIs may change in future releases.
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
| Filename, size | File type | Python version | Upload date | Hashes |
| --- | --- | --- | --- | --- |
| BentoML-0.2.2-py3-none-any.whl (99.6 kB) | Wheel | py3 | | View |
| BentoML-0.2.2.tar.gz (51.7 kB) | Source | None | | View |