
BentoML: Package and Deploy Your Machine Learning Models

Project description

BentoML


BentoML is an open source tool for packaging machine learning models and their preprocessing code into a container image or Python library that can be easily used for testing and production deployment.

  • Best Practices Built-in - BentoML ships with a built-in model server that supports telemetry and logging, making it easy to integrate with production systems. It aims for the best performance possible by enabling dynamic batching, caching, and parallelized preprocessing, and by providing a customizable inference runtime.

  • Multiple framework support - BentoML supports a wide range of ML frameworks out of the box, including TensorFlow, PyTorch, Scikit-Learn, and XGBoost, and can be easily extended to work with new or custom frameworks.

  • Streamlined deployment workflows - BentoML has built-in support for deploying models as REST APIs running on Kubernetes, AWS EC2, AWS ECS, Google Cloud Platform, AWS SageMaker, and Azure ML.
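The snippet below is a rough sketch of that packaging workflow: defining a prediction service around a trained scikit-learn model and saving it as a deployable archive. It assumes the classic BentoML 0.x-style service API (BentoService, @artifacts, @api, PickleArtifact, DataframeHandler); the exact names and module paths may differ in this early development release.

```python
# Sketch only: decorator and handler names follow the classic BentoML 0.x API
# and may not match this 0.0.7 dev release exactly.
from sklearn import datasets, svm

from bentoml import BentoService, api, artifacts
from bentoml.artifact import PickleArtifact
from bentoml.handlers import DataframeHandler


@artifacts([PickleArtifact("model")])
class IrisClassifier(BentoService):
    """Bundles a trained model with the code needed to serve predictions."""

    @api(DataframeHandler)
    def predict(self, df):
        # Any preprocessing of the incoming DataFrame would go here
        return self.artifacts.model.predict(df)


if __name__ == "__main__":
    # Train a simple model to package
    iris = datasets.load_iris()
    clf = svm.SVC(gamma="scale")
    clf.fit(iris.data, iris.target)

    # Pack the trained model into the service and save a versioned archive
    service = IrisClassifier()
    service.pack("model", clf)
    saved_path = service.save()
    print("Saved BentoML service to:", saved_path)
```

The saved archive bundles the model, its preprocessing code, and dependency metadata, so it can be served as a REST API, built into a container image, or installed like a regular Python package.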


Download files


Source Distribution

BentoML-0.0.7.dev0.tar.gz (19.4 kB)

