
🚅 litellm


a light package to simplify calling OpenAI, Azure, Cohere, Anthropic, and Huggingface API endpoints. It manages:

  • translating inputs to the provider's completion and embedding endpoints
  • guaranteeing consistent output: text responses are always available at ['choices'][0]['message']['content']
  • exception mapping: common exceptions across providers are mapped to the OpenAI exception types (see the sketch after this list)
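
Because exceptions are normalized, a single except clause covers every provider. A minimal sketch, assuming the pre-1.0 openai SDK (where exception classes live in openai.error); the invalid key is deliberate:

import os
from openai.error import OpenAIError
from litellm import completion

os.environ["ANTHROPIC_API_KEY"] = "bad-key"  # deliberately invalid

try:
    completion(model="claude-instant-1",
               messages=[{"content": "Hey, how's it going?", "role": "user"}])
except OpenAIError as e:
    # the provider-specific auth failure surfaces as an OpenAI exception type
    print(type(e).__name__, e)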

usage


Demo - https://litellm.ai/playground
Docs - https://docs.litellm.ai/docs/
Free Dashboard - https://docs.litellm.ai/docs/debugging/hosted_debugging

quick start

pip install litellm
from litellm import completion
import os

## set ENV variables
os.environ["OPENAI_API_KEY"] = "openai key"
os.environ["COHERE_API_KEY"] = "cohere key"

messages = [{"content": "Hello, how are you?", "role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)

Code Sample: Getting Started Notebook

Stable version

pip install litellm==0.1.424

Streaming Queries

liteLLM supports streaming the model response back; pass stream=True to get a streaming iterator in the response. Streaming is supported for OpenAI, Azure, Anthropic, and Huggingface models.

response = completion(model="gpt-3.5-turbo", messages=messages, stream=True)
for chunk in response:
    print(chunk['choices'][0]['delta'])

# claude 2
result = completion(model='claude-2', messages=messages, stream=True)
for chunk in result:
    print(chunk['choices'][0]['delta'])
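
To rebuild the full text from a stream, concatenate each delta's content. A minimal sketch, assuming chunks follow the OpenAI delta format, where 'content' can be absent (e.g. in the role-only first delta):

full_text = ""
response = completion(model="gpt-3.5-turbo", messages=messages, stream=True)
for chunk in response:
    delta = chunk['choices'][0]['delta']
    full_text += delta.get('content') or ''
print(full_text)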

support / talk with founders

why did we build this

  • Need for simplicity: Our code started to get extremely complicated managing and translating calls between Azure, OpenAI, and Cohere.
