Library to easily interface with LLM API providers
🚅 LiteLLM
Call all LLM APIs using the OpenAI format [Anthropic, Huggingface, Cohere, Azure OpenAI, etc.]
100+ Supported Models | Docs | Demo Website
LiteLLM manages
- Translating inputs to the provider's completion and embedding endpoints
- Guaranteeing consistent output: text responses are always available at ['choices'][0]['message']['content']
- Exception mapping: common exceptions across providers are mapped to the OpenAI exception types (see the sketch after this list)
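Because provider errors are mapped to the OpenAI exception types, one handler can cover every provider. A minimal sketch, assuming the pre-1.0 openai package (whose openai.error classes are the mapping target at this version) and a deliberately invalid key:

import os
from openai.error import AuthenticationError, OpenAIError  # exception types from openai<1.0
from litellm import completion

os.environ["ANTHROPIC_API_KEY"] = "not-a-real-key"  # invalid on purpose, just for this sketch

try:
    completion(model="claude-2", messages=[{"content": "Hello", "role": "user"}])
except AuthenticationError as e:
    # the provider's auth failure surfaces as the OpenAI exception type
    print("auth error:", e)
except OpenAIError as e:
    # any other mapped provider error
    print("provider error:", e)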
Usage
pip install litellm
import os
from litellm import completion

## set ENV variables
os.environ["OPENAI_API_KEY"] = "openai key"
os.environ["COHERE_API_KEY"] = "cohere key"
os.environ["ANTHROPIC_API_KEY"] = "anthropic key"

messages = [{"content": "Hello, how are you?", "role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)
# cohere call
response = completion(model="command-nightly", messages=messages)
# anthropic call
response = completion(model="claude-2", messages=messages)
Stable version
pip install litellm==0.1.424
LiteLLM Client - debugging & 1-click add new LLMs
Debugging Dashboard 👉 https://docs.litellm.ai/docs/debugging/hosted_debugging
Streaming
liteLLM supports streaming the model response back: pass stream=True to receive an iterator of chunks in the response.
Streaming is supported for OpenAI, Azure, Anthropic, and Huggingface models.
response = completion(model="gpt-3.5-turbo", messages=messages, stream=True)
for chunk in response:
print(chunk['choices'][0]['delta'])
# claude 2
result = completion('claude-2', messages, stream=True)
for chunk in result:
print(chunk['choices'][0]['delta'])
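To rebuild the full completion from a stream, concatenate the chunk deltas. A minimal sketch, assuming each chunk follows the OpenAI streaming schema, where the delta behaves like a dict and may omit 'content' (e.g. in the final chunk):

response = completion(model="gpt-3.5-turbo", messages=messages, stream=True)

full_text = ""
for chunk in response:
    delta = chunk['choices'][0]['delta']
    # not every chunk carries text, so fall back to an empty string
    full_text += delta.get('content') or ""

print(full_text)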
support / talk with founders
- Schedule Demo 👋
- Community Discord 💭
- Our numbers 📞 +1 (770) 8783-106 / +1 (412) 618-6238
- Our emails ✉️ ishaan@berri.ai / krrish@berri.ai
why did we build this
- Need for simplicity: our code started to get extremely complicated managing & translating calls between Azure, OpenAI, and Cohere.