
LlamaIndex Llms Integration: Vercel AI Gateway

Installation

To install the required packages, run:

%pip install llama-index-llms-vercel-ai-gateway
!pip install llama-index

Setup

Initialize Vercel AI Gateway

Set either the VERCEL_AI_GATEWAY_API_KEY or VERCEL_OIDC_TOKEN environment variable, or pass your API key directly to the class constructor. Replace <your-api-key> with your actual API key:

from llama_index.llms.vercel_ai_gateway import VercelAIGateway
from llama_index.core.llms import ChatMessage

llm = VercelAIGateway(
    api_key="<your-api-key>",
    max_tokens=200000,
    context_window=64000,
    model="anthropic/claude-4-sonnet",
)
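If you prefer environment variables over a hard-coded key, a minimal sketch, assuming the constructor falls back to VERCEL_AI_GATEWAY_API_KEY (or VERCEL_OIDC_TOKEN) when api_key is omitted:

```python
import os

# Assumption: when api_key is not passed, VercelAIGateway reads the
# VERCEL_AI_GATEWAY_API_KEY (or VERCEL_OIDC_TOKEN) environment variable.
os.environ["VERCEL_AI_GATEWAY_API_KEY"] = "<your-api-key>"

from llama_index.llms.vercel_ai_gateway import VercelAIGateway

llm = VercelAIGateway(model="anthropic/claude-4-sonnet")
```

This keeps credentials out of source control; in production, export the variable in your shell or deployment environment instead of setting it in code.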

Generate Chat Responses

You can generate a chat response by sending a list of ChatMessage instances:

message = ChatMessage(role="user", content="Tell me a joke")
resp = llm.chat([message])
print(resp)

Streaming Responses

To stream responses, use the stream_chat method:

message = ChatMessage(role="user", content="Tell me a story in 250 words")
resp = llm.stream_chat([message])
for r in resp:
    print(r.delta, end="")

Complete with Prompt

You can also generate completions with a prompt using the complete method:

resp = llm.complete("Tell me a joke")
print(resp)

Streaming Completion

To stream completions, use the stream_complete method:

resp = llm.stream_complete("Tell me a story in 250 words")
for r in resp:
    print(r.delta, end="")

Model Configuration

To use a specific model, specify it during initialization. For example, to use Anthropic's Claude 4 Sonnet model, set it like this:

llm = VercelAIGateway(model="anthropic/claude-4-sonnet")
resp = llm.complete("Write a story about a dragon who can code in Rust")
print(resp)

LLM Implementation Example

https://docs.llamaindex.ai/en/stable/examples/llm/vercel-ai-gateway/
