llama-index llms litellm integration
Project description
LlamaIndex LLMs Integration: LiteLLM
Installation
Install the required Python packages:
%pip install llama-index-llms-litellm
!pip install llama-index
Usage
Import Required Libraries
import os
from llama_index.llms.litellm import LiteLLM
from llama_index.core.llms import ChatMessage
Set Up Environment Variables
Set your API keys as environment variables:
os.environ["OPENAI_API_KEY"] = "your-api-key"
os.environ["COHERE_API_KEY"] = "your-api-key"
Example: OpenAI Call
To interact with the OpenAI model:
message = ChatMessage(role="user", content="Hey! how's it going?")
llm = LiteLLM("gpt-3.5-turbo")
chat_response = llm.chat([message])
print(chat_response)
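Besides chat, the standard LlamaIndex text-completion interface is also available on the same object. A minimal sketch, assuming the usual complete() method exposed by LlamaIndex LLMs:
llm = LiteLLM("gpt-3.5-turbo")
# Single-turn text completion instead of a chat exchange
completion_response = llm.complete("Paul Graham is ")
print(completion_response)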
Example: Cohere Call
To interact with the Cohere model:
llm = LiteLLM("command-nightly")
chat_response = llm.chat([message])
print(chat_response)
Example: Chat with System Message
To have a chat with a system role:
messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="Tell me a story"),
]
resp = LiteLLM("gpt-3.5-turbo").chat(messages)
print(resp)
Streaming Responses
To use the streaming feature with stream_complete:
llm = LiteLLM("gpt-3.5-turbo")
resp = llm.stream_complete("Paul Graham is ")
for r in resp:
    print(r.delta, end="")
Streaming Chat Example
To stream chat messages:
llm = LiteLLM("gpt-3.5-turbo")
resp = llm.stream_chat(messages)
for r in resp:
    print(r.delta, end="")
Asynchronous Example
For asynchronous calls, use:
llm = LiteLLM("gpt-3.5-turbo")
resp = await llm.acomplete("Paul Graham is ")
print(resp)
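Note that top-level await only works inside an async context such as a notebook. For a plain Python script, a minimal sketch using standard asyncio:
import asyncio

from llama_index.llms.litellm import LiteLLM


async def main():
    llm = LiteLLM("gpt-3.5-turbo")
    # Asynchronous completion call
    resp = await llm.acomplete("Paul Graham is ")
    print(resp)


asyncio.run(main())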
LLM Implementation example
Download files
Source Distribution: llama_index_llms_litellm-0.3.0.tar.gz
Built Distribution: llama_index_llms_litellm-0.3.0-py3-none-any.whl
File details
Details for the file llama_index_llms_litellm-0.3.0.tar.gz.
File metadata
- Download URL: llama_index_llms_litellm-0.3.0.tar.gz
- Upload date:
- Size: 7.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.3 CPython/3.11.10 Darwin/22.3.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | 06535d11e0ecf347342641aba1de80f06b521f8086614a71a5569e41371fb630
MD5 | 85fa625e9111b840bf81947b4a2a294c
BLAKE2b-256 | 6ba61f82167194a27ac4aefa7314c7bcebc87478eed81caca604e1030985b11b
File details
Details for the file llama_index_llms_litellm-0.3.0-py3-none-any.whl.
File metadata
- Download URL: llama_index_llms_litellm-0.3.0-py3-none-any.whl
- Upload date:
- Size: 7.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.3 CPython/3.11.10 Darwin/22.3.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | b1422cf66d4e9f1e02256de53333ce3cbf46bf57a4c37c2eccbf47500f57ee0e
MD5 | 1dd2b07803f18db9142185cfd6769c1d
BLAKE2b-256 | fac5ba74b47af20283d838fd76a31664e6dfaaf424b05c8194f869c2ed2b441b