
LlamaIndex LLMs Integration: LiteLLM

Installation

  1. Install the required Python packages:

    %pip install llama-index-llms-litellm
    %pip install llama-index
    
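Outside a notebook, the same packages can be installed from a regular shell:

```shell
pip install llama-index-llms-litellm llama-index
```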

Usage

Import Required Libraries

import os
from llama_index.llms.litellm import LiteLLM
from llama_index.core.llms import ChatMessage

Set Up Environment Variables

Set your API keys as environment variables. LiteLLM routes each request to the provider matching the model name, so only the key for the provider you actually call is required:

os.environ["OPENAI_API_KEY"] = "your-api-key"
os.environ["COHERE_API_KEY"] = "your-api-key"

Example: OpenAI Call

To chat with an OpenAI model:

message = ChatMessage(role="user", content="Hey! how's it going?")
llm = LiteLLM("gpt-3.5-turbo")
chat_response = llm.chat([message])
print(chat_response)

Example: Cohere Call

To chat with a Cohere model (reusing the `message` from above):

llm = LiteLLM("command-nightly")
chat_response = llm.chat([message])
print(chat_response)

Example: Chat with System Message

To chat with a system message that sets the assistant's persona:

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="Tell me a story"),
]
resp = LiteLLM("gpt-3.5-turbo").chat(messages)
print(resp)

Streaming Responses

To use the streaming feature with stream_complete:

llm = LiteLLM("gpt-3.5-turbo")
resp = llm.stream_complete("Paul Graham is ")
for r in resp:
    print(r.delta, end="")

Streaming Chat Example

To stream chat messages:

llm = LiteLLM("gpt-3.5-turbo")
resp = llm.stream_chat(messages)
for r in resp:
    print(r.delta, end="")

Asynchronous Example

For asynchronous calls (inside an already-running event loop, such as a notebook), use:

llm = LiteLLM("gpt-3.5-turbo")
resp = await llm.acomplete("Paul Graham is ")
print(resp)
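
Bare `await` only works where an event loop is already running. In a plain Python script, wrap the call with `asyncio.run`; a minimal sketch (a valid `OPENAI_API_KEY` is still required):

```python
import asyncio

from llama_index.llms.litellm import LiteLLM


async def main():
    llm = LiteLLM("gpt-3.5-turbo")
    # acomplete is the async counterpart of complete
    resp = await llm.acomplete("Paul Graham is ")
    print(resp)


asyncio.run(main())
```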

LLM Implementation Example

https://docs.llamaindex.ai/en/stable/examples/llm/litellm/

Download files

Source distribution: llama_index_llms_litellm-0.7.1.tar.gz (10.9 kB)

Built distribution: llama_index_llms_litellm-0.7.1-py3-none-any.whl (11.2 kB)

