Scale LLM Engine Python client
LLM Engine
The LLM Engine Python library provides a convenient way of interfacing with an llmengine endpoint running on LLM Engine or on your own infrastructure.
Get Started
Install
pip install scale-llm-engine
Usage
If you are using LLM Engine, you can get your API key from https://spellbook.scale.com/settings. Set the SCALE_API_KEY environment variable to your API key.
If you are using your own infrastructure, you can set the LLM_ENGINE_SERVE_BASE_PATH environment variable to the base URL of your self-hosted llmengine endpoint.
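With either setup, the environment variable can be exported in the shell before running your script. A minimal sketch (the key and URL below are placeholders, not real values):

```shell
# Hosted LLM Engine: set your Scale API key (placeholder value).
export SCALE_API_KEY="your-api-key"

# Self-hosted: point the client at your own endpoint instead (placeholder URL).
export LLM_ENGINE_SERVE_BASE_PATH="https://llmengine.example.com"
```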
from llmengine import Completion

# Generate a short completion from the hosted llama-2-7b model.
response = Completion.create(
    model="llama-2-7b",
    prompt="Hello, my name is",
    max_new_tokens=10,
    temperature=0.2,
)
print(response.outputs[0].text)
Documentation
Documentation is available at https://scaleapi.github.io/llm-engine/.