
BentoML: The easiest way to serve AI apps and models

Project description

BentoML Unsloth integrations

Installation

pip install "bentoml[unsloth]"

Examples

See train.py

API

To use this integration, call bentoml.unsloth.build_bento:

bentoml.unsloth.build_bento(model, tokenizer)

If your model is continued from a fine-tuned checkpoint, then model_name must be passed as well:

bentoml.unsloth.build_bento(model, tokenizer, model_name="llama-3-continued-from-checkpoint")
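For orientation, here is a minimal end-to-end sketch, assuming a model loaded with Unsloth's FastLanguageModel and an example checkpoint name (unsloth/llama-3-8b-bnb-4bit); adapt the identifiers to your own setup:

import bentoml
from unsloth import FastLanguageModel

# Load a base model and tokenizer with Unsloth (4-bit loading shown as an example).
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/llama-3-8b-bnb-4bit",  # example checkpoint; replace with yours
    max_seq_length=2048,
    load_in_4bit=True,
)

# ... fine-tune the model here (see train.py) ...

# Package the model and tokenizer into a Bento for serving.
bento = bentoml.unsloth.build_bento(model, tokenizer)

# When training continued from a fine-tuned checkpoint, pass model_name explicitly:
# bento = bentoml.unsloth.build_bento(
#     model, tokenizer, model_name="llama-3-continued-from-checkpoint"
# )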

Important

Make sure to save the chat template to the tokenizer instance so that generations match how you set up your data pipeline. See the example and documentation for more information.
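As an illustration, one way to do this is with Unsloth's get_chat_template helper; the llama-3 template name below is an assumption about your data pipeline, not a requirement of the integration:

from unsloth.chat_templates import get_chat_template

# Attach the chat template used during training to the tokenizer so that
# the served model formats prompts the same way as the training data.
tokenizer = get_chat_template(tokenizer, chat_template="llama-3")  # template name is an example

bentoml.unsloth.build_bento(model, tokenizer)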
