
Project description

LlamaIndex Llms Integration: Vercel AI Gateway

Installation

To install the required packages, run:

%pip install llama-index-llms-vercel-ai-gateway
%pip install llama-index

Setup

Initialize Vercel AI Gateway

Set one of the environment variables VERCEL_AI_GATEWAY_API_KEY or VERCEL_OIDC_TOKEN, or pass your API key directly to the class constructor. Replace <your-api-key> with your actual API key:

from llama_index.llms.vercel_ai_gateway import VercelAIGateway
from llama_index.core.llms import ChatMessage

llm = VercelAIGateway(
    api_key="<your-api-key>",
    max_tokens=200000,
    context_window=64000,
    model="anthropic/claude-4-sonnet",
)
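Alternatively, you can omit the api_key argument and rely on the environment. A minimal sketch, assuming the constructor falls back to VERCEL_AI_GATEWAY_API_KEY (or VERCEL_OIDC_TOKEN) when no key is passed explicitly:

```shell
# Export one of the supported variables before starting Python;
# the constructor picks it up when api_key is not provided.
export VERCEL_AI_GATEWAY_API_KEY="<your-api-key>"

# Or, when running inside a Vercel deployment with OIDC enabled:
# export VERCEL_OIDC_TOKEN="<your-oidc-token>"
```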

Generate Chat Responses

You can generate a chat response by sending a list of ChatMessage instances:

message = ChatMessage(role="user", content="Tell me a joke")
resp = llm.chat([message])
print(resp)

Streaming Responses

To stream responses, use the stream_chat method:

message = ChatMessage(role="user", content="Tell me a story in 250 words")
resp = llm.stream_chat([message])
for r in resp:
    print(r.delta, end="")

Complete with Prompt

You can also generate completions with a prompt using the complete method:

resp = llm.complete("Tell me a joke")
print(resp)

Streaming Completion

To stream completions, use the stream_complete method:

resp = llm.stream_complete("Tell me a story in 250 words")
for r in resp:
    print(r.delta, end="")

Model Configuration

To use a specific model, specify it during initialization. For example, to use Anthropic's Claude 4 Sonnet model, set it like this:

llm = VercelAIGateway(model="anthropic/claude-4-sonnet")
resp = llm.complete("Write a story about a dragon who can code in Rust")
print(resp)

LLM Implementation Example

https://docs.llamaindex.ai/en/stable/examples/llm/vercel-ai-gateway/

