Scale LLM Engine Python client

LLM Engine

The LLM Engine Python library provides a convenient way to interface with an llmengine endpoint running on Scale's hosted LLM Engine or on your own infrastructure.

Get Started

Install

pip install scale-llm-engine

Usage

If you are using Scale's hosted LLM Engine, get your API key from https://spellbook.scale.com/settings and set the SCALE_API_KEY environment variable to it.

If you are running LLM Engine on your own infrastructure, set the LLM_ENGINE_BASE_PATH environment variable to the base URL of your self-hosted llmengine endpoint.
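For example, here is a minimal sketch of configuring the client from Python, assuming the environment variables are read when llmengine is imported; the key and URL values below are placeholders. You can equally export the same variables in your shell before starting Python.

import os

# Set one of the following, depending on where your endpoint runs.
os.environ["SCALE_API_KEY"] = "your-api-key"                         # hosted LLM Engine
os.environ["LLM_ENGINE_BASE_PATH"] = "https://your-llmengine-host"   # self-hosted endpoint

from llmengine import Completion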

from llmengine import Completion

response = Completion.create(
    model="llama-2-7b",
    prompt="Hello, my name is",
    max_new_tokens=10,
    temperature=0.2,
)
print(response.outputs[0].text)
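
If you plan to issue repeated completions, the call above can be wrapped in a small helper. This is only an illustrative sketch that reuses the parameters and response access pattern shown in the example; the function name and defaults are not part of the library.

from llmengine import Completion

def complete(prompt: str, model: str = "llama-2-7b", max_new_tokens: int = 10, temperature: float = 0.2) -> str:
    # Issue a synchronous completion request and return the generated text,
    # mirroring the response access pattern used in the example above.
    response = Completion.create(
        model=model,
        prompt=prompt,
        max_new_tokens=max_new_tokens,
        temperature=temperature,
    )
    return response.outputs[0].text

print(complete("Hello, my name is"))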

Documentation

Documentation is available at https://scaleapi.github.io/llm-engine/.
