
Library to easily interface with LLM API providers

Project description

🚅 LiteLLM

Call all LLM APIs using the OpenAI format [Anthropic, Huggingface, Cohere, TogetherAI, Azure, OpenAI, etc.]



100+ Supported Models | Docs | Demo Website

📣 1-click deploy your own LLM proxy server. Grab time if you're interested!

LiteLLM manages

  • Translating inputs to the provider's completion and embedding endpoints
  • Consistent output: text responses are always available at ['choices'][0]['message']['content']
  • Exception mapping: common exceptions across providers are mapped to the OpenAI exception types (sketched just below this list)
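
A minimal sketch of the exception-mapping point, assuming the pre-1.0 openai SDK (whose exception classes live in openai.error); the invalid key is deliberate:

import os
import openai.error  # assumption: openai<1.0, where the mapped exception types live
from litellm import completion

os.environ["COHERE_API_KEY"] = "intentionally-bad-key"
try:
    completion(model="command-nightly",
               messages=[{"content": "Hi", "role": "user"}])
except openai.error.OpenAIError as e:
    # Cohere's auth failure surfaces as a familiar OpenAI exception type
    print(f"caught mapped exception: {e}")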

Usage

pip install litellm
from litellm import completion
import os
## set ENV variables
os.environ["OPENAI_API_KEY"] = "openai key"
os.environ["COHERE_API_KEY"] = "cohere key"

messages = [{"content": "Hello, how are you?", "role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)

Stable version

pip install litellm==0.1.424

Streaming

LiteLLM supports streaming the model response back: pass stream=True to get a streaming iterator as the response. Streaming is supported for OpenAI, Azure, Anthropic, and Hugging Face models.

response = completion(model="gpt-3.5-turbo", messages=messages, stream=True)
for chunk in response:
    print(chunk['choices'][0]['delta'])

# claude 2
result = completion(model='claude-2', messages=messages, stream=True)
for chunk in result:
    print(chunk['choices'][0]['delta'])
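
Not from the README, but a common follow-on: the streamed deltas can be stitched back into the full reply. This sketch assumes each delta is a dict that may carry a 'content' key, mirroring OpenAI's streaming format:

full_reply = ""
for chunk in completion(model="gpt-3.5-turbo", messages=messages, stream=True):
    delta = chunk['choices'][0]['delta']
    # role/stop chunks may omit 'content', hence the default
    full_reply += delta.get('content', '')
print(full_reply)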

Support / talk with founders

Why did we build this

  • Need for simplicity: our code started to get extremely complicated managing and translating calls between Azure, OpenAI, and Cohere.

Contributors

Project details



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

litellm-0.1.690.tar.gz (91.5 kB)

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

litellm-0.1.690-py3-none-any.whl (124.4 kB)

File details

Details for the file litellm-0.1.690.tar.gz.

File metadata

  • Download URL: litellm-0.1.690.tar.gz
  • Upload date:
  • Size: 91.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.8.18

File hashes

Hashes for litellm-0.1.690.tar.gz
Algorithm Hash digest
SHA256 51271a11f4c963da23a9b7bb75dca9a8de4c5dee1b9906057e59ca9e51ac400a
MD5 d972461dc4f96c8eff8f6771861cccdd
BLAKE2b-256 06153b601476f3e41a899568612b181ac0dbf06cc6fba86687f478045547da42

See more details on using hashes here.
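
To check a downloaded sdist against the SHA256 digest above, the standard library is enough (the expected value is copied from the table):

import hashlib

expected = "51271a11f4c963da23a9b7bb75dca9a8de4c5dee1b9906057e59ca9e51ac400a"
with open("litellm-0.1.690.tar.gz", "rb") as f:
    actual = hashlib.sha256(f.read()).hexdigest()
print("OK" if actual == expected else "hash mismatch")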

File details

Details for the file litellm-0.1.690-py3-none-any.whl.

File metadata

  • Download URL: litellm-0.1.690-py3-none-any.whl
  • Upload date:
  • Size: 124.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.8.18

File hashes

Hashes for litellm-0.1.690-py3-none-any.whl
Algorithm Hash digest
SHA256 b936b80bdeba02a4a278c94fdcf4eff22904e531af86b7e1a8c01518a5ac00a6
MD5 d72f8863365d6d621fded2de488c9f92
BLAKE2b-256 bb6adb13699a9ac5ed50db3544271a52aeeeb32dab30ab9b3a6d067e84c37cb7

See more details on using hashes here.
