BentoML: Package and Deploy Your Machine Learning Models
BentoML is an open-source tool for packaging machine learning models, together with their preprocessing code, into a container image or Python library that can be easily used for testing and production deployment.
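BentoML's own packaging API has changed considerably across versions, so rather than quote it here, the core idea can be illustrated in plain Python: bundle the trained model and its preprocessing into one serialized artifact, so the serving environment can never drift from the training-time preprocessing. All names below (`Bundle`, `ToyModel`, `normalize`) are hypothetical stand-ins, not BentoML API.

```python
import pickle

def normalize(features):
    # Toy preprocessing step: scale features into [0, 1].
    hi = max(features) or 1
    return [f / hi for f in features]

class ToyModel:
    # Stand-in for a trained estimator from any framework.
    def predict(self, features):
        return sum(features)

class Bundle:
    """Hypothetical 'bento'-style artifact: model + preprocessing, together."""
    def __init__(self, model, preprocess):
        self.model = model
        self.preprocess = preprocess

    def predict(self, raw):
        # Serving always applies the exact preprocessing saved with the model.
        return self.model.predict(self.preprocess(raw))

bundle = Bundle(ToyModel(), normalize)
blob = pickle.dumps(bundle)    # serialize model and preprocessing as one unit
restored = pickle.loads(blob)  # later, in the serving environment
```

Because the preprocessing travels inside the artifact, the deployment target only needs to call `restored.predict(raw_input)`.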
Best Practices Built In - BentoML ships a built-in model server with telemetry and logging support, making it easy to integrate with production systems. It pursues the best possible performance through dynamic batching, caching, parallelized preprocessing steps, and a customizable inference runtime.
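Dynamic batching is worth a brief illustration: many models predict far faster on one batch of N inputs than on N single inputs, so the server groups incoming requests into micro-batches before invoking the model. The sketch below is a simplified, single-threaded version of that idea (a real server would also bound how long a request waits), not BentoML's implementation.

```python
def microbatch(requests, predict_batch, max_batch=4):
    """Group incoming requests into batches of up to max_batch,
    run one vectorized predict call per batch, and yield one
    result per original request, in order."""
    batch = []
    for req in requests:
        batch.append(req)
        if len(batch) == max_batch:
            yield from predict_batch(batch)
            batch = []
    if batch:  # flush the final partial batch
        yield from predict_batch(batch)
```

With `max_batch=4`, ten requests cost three model invocations instead of ten, which is where the throughput gain comes from.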
Multiple framework support - BentoML supports a wide range of ML frameworks out of the box, including TensorFlow, PyTorch, Scikit-learn, and XGBoost, and can be easily extended to work with new or custom frameworks.
Streamlined deployment workflows - BentoML has built-in support for deploying models as REST APIs running on Kubernetes, AWS EC2, AWS ECS, Google Cloud Platform, AWS SageMaker, and Azure ML.
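To make "deploying a model as a REST API" concrete, here is the general shape of such a service using only the Python standard library: a single `POST /predict` endpoint that accepts a JSON feature vector and returns a JSON prediction. This is a generic sketch of the pattern, not the server BentoML generates; the model here is a hypothetical function that sums its inputs.

```python
import json
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

def make_handler(model):
    class PredictHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            if self.path != "/predict":
                self.send_error(404)
                return
            # Read the JSON request body and run the model on it.
            length = int(self.headers.get("Content-Length", 0))
            features = json.loads(self.rfile.read(length))
            body = json.dumps({"prediction": model(features)}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

        def log_message(self, *args):
            pass  # keep the example quiet

    return PredictHandler

# Port 0 lets the OS pick a free port; a real deployment would fix the port.
server = ThreadingHTTPServer(("127.0.0.1", 0), make_handler(lambda xs: sum(xs)))
```

A platform like Kubernetes or SageMaker then only needs a container that starts such a server and exposes its port.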
| Filename | Size | File type | Python version |
|---|---|---|---|
| BentoML-0.0.5-py3-none-any.whl | 36.5 kB | Wheel | py3 |
| BentoML-0.0.5.tar.gz | 16.0 kB | Source | None |