An open framework for building, shipping and running machine learning services

Project description

BentoML

From a model in a Jupyter notebook to a production API service in 5 minutes.


BentoML is a Python framework for building, shipping, and running machine learning services. It provides high-level APIs for defining an ML service and packaging its artifacts, source code, dependencies, and configurations into a production-system-friendly format that is ready for deployment.



Feature Highlights

  • Multiple Distribution Formats - Easily package your machine learning models and preprocessing code into a format that works best with your inference scenario:

    • Docker Image - deploy as a container running a REST API server
    • PyPI Package - integrate seamlessly into your Python applications
    • CLI tool - put your model into an Airflow DAG or CI/CD pipeline
    • Spark UDF - run batch serving on large datasets with Spark
    • Serverless Function - host your model on serverless platforms such as AWS Lambda
  • Multiple Framework Support - BentoML supports a wide range of ML frameworks out of the box, including TensorFlow, PyTorch, scikit-learn, and XGBoost, and can be easily extended to work with new or custom frameworks.

  • Deploy Anywhere - BentoML-bundled ML services can be easily deployed with platforms such as Docker, Kubernetes, Serverless, Airflow, and Clipper, on cloud platforms including AWS, Google Cloud, and Azure.

  • Custom Runtime Backend - Easily integrate your Python pre-processing code with high-performance deep learning serving runtimes such as TensorFlow Serving.
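Once a service is shipped as a Docker image running a REST API server, a client would send input rows as a JSON payload for the DataframeHandler to parse into a pandas DataFrame. The endpoint path, port, and column order below are assumptions for illustration, not BentoML's documented defaults:

```python
import json

# Example iris measurements as rows; the DataframeHandler expects a
# JSON payload that pandas can parse into a DataFrame.
rows = [[5.1, 3.5, 1.4, 0.2], [6.2, 2.8, 4.8, 1.8]]
payload = json.dumps(rows)

# Hypothetical call against a locally running container (port assumed):
# import requests
# resp = requests.post("http://localhost:5000/predict", data=payload,
#                      headers={"Content-Type": "application/json"})
print(payload)
```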

Installation


pip install bentoml

Verify installation:

bentoml --version

Getting Started

Defining a machine learning service with BentoML is as simple as a few lines of code:

from bentoml import BentoService, api, env, artifacts
from bentoml.artifact import PickleArtifact
from bentoml.handlers import DataframeHandler

@artifacts([PickleArtifact('model')])
@env(conda_pip_dependencies=["scikit-learn"])
class IrisClassifier(BentoService):

    @api(DataframeHandler)
    def predict(self, df):
        # df is a pandas DataFrame; delegate to the pickled model
        return self.artifacts.model.predict(df)

Read our 5-minute Quick Start Guide, which shows how to productionize a scikit-learn model and deploy it to AWS Lambda.
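The PickleArtifact in the example above persists the trained model with Python's pickle module. A minimal sketch of that underlying mechanism in plain Python (ThresholdModel is a hypothetical stand-in, not BentoML code):

```python
import io
import pickle

class ThresholdModel:
    """Hypothetical stand-in for a trained model."""
    def predict(self, rows):
        return [1 if r[0] > 5.0 else 0 for r in rows]

# "pack": serialize the model, as PickleArtifact does at save time
buf = io.BytesIO()
pickle.dump(ThresholdModel(), buf)

# "load": restore the model at serving time and run a prediction
model = pickle.loads(buf.getvalue())
print(model.predict([[5.1, 3.5, 1.4, 0.2], [4.9, 3.0, 1.4, 0.2]]))  # → [1, 0]
```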

Documentation

Official BentoML documentation can be found at bentoml.readthedocs.io

Examples

All examples can be found under the BentoML/examples directory. More tutorials and examples coming soon!

Deployment guides:

We collect example notebook page views to help us improve this project. To opt-out of tracking, delete the [Impression] line in the first markdown cell of any example notebook: ![Impression](http...

Releases and Contributing

BentoML is under active development and is evolving rapidly. It is currently a beta release, and APIs may change in future releases.

To make sure you have a pleasant experience, please read the code of conduct. It outlines core values and beliefs and will make working together a happier experience.

Have questions or feedback? Post a new GitHub issue or join our Gitter chat room: https://gitter.im/bentoml/BentoML

Want to help build BentoML? Check out our contributing guide and the development guide for setting up local development and testing environments for BentoML.

Happy hacking!

License

BentoML is released under the Apache License 2.0, as found in the LICENSE file.


Project details



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

BentoML-0.2.1.tar.gz (47.1 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

BentoML-0.2.1-py3-none-any.whl (94.7 kB)

Uploaded Python 3

File details

Details for the file BentoML-0.2.1.tar.gz.

File metadata

  • Download URL: BentoML-0.2.1.tar.gz
  • Upload date:
  • Size: 47.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/40.6.3 requests-toolbelt/0.8.0 tqdm/4.29.1 CPython/3.7.3

File hashes

Hashes for BentoML-0.2.1.tar.gz:

  • SHA256: 959b7c668ac6cfd00028fd3c0e627ce59d66722aa0c2171b4a3ef4bb9e0f8ba6
  • MD5: 6f1a729d57cf255f7c4200422ff20dad
  • BLAKE2b-256: dfd4b6148ed9a9b521eaec4eafbe9822395fe9ac9dd5106c992fdfd9a6104458

See more details on using hashes here.
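To check a downloaded distribution against the published digests, you can compute its SHA256 locally with the standard library. The local file path in the commented-out lines is an assumption; the digest is the one published above:

```python
import hashlib
import io

def sha256_hex(stream, chunk_size=8192):
    """Stream a binary file object through SHA256 and return the hex digest."""
    h = hashlib.sha256()
    for chunk in iter(lambda: stream.read(chunk_size), b""):
        h.update(chunk)
    return h.hexdigest()

# Hypothetical local path to the downloaded sdist; compare the result
# against the published SHA256 digest:
# with open("BentoML-0.2.1.tar.gz", "rb") as f:
#     print(sha256_hex(f))
```

Alternatively, pip can enforce hashes at install time via a requirements file with `--require-hashes`.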

File details

Details for the file BentoML-0.2.1-py3-none-any.whl.

File metadata

  • Download URL: BentoML-0.2.1-py3-none-any.whl
  • Upload date:
  • Size: 94.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/40.6.3 requests-toolbelt/0.8.0 tqdm/4.29.1 CPython/3.7.3

File hashes

Hashes for BentoML-0.2.1-py3-none-any.whl:

  • SHA256: 4d1fd88e79b9c7636a033423315d1034c64d3654098f1c1b87637e1c82b5f449
  • MD5: cfcc0abdd85d2f2eb5e837cb14a00c6b
  • BLAKE2b-256: c2942f9ccdeb76cc6743fb7c1b9b3cc1a0608c398ac1208206a7765e922c9f85

See more details on using hashes here.
