LlamaIndex LLMs Integration: OpenAI
Installation
To install the required package, run:

```shell
%pip install llama-index-llms-openai
```
Setup
Set your OpenAI API key as an environment variable. You can replace "sk-..." with your actual API key:

```python
import os

os.environ["OPENAI_API_KEY"] = "sk-..."
```
Basic Usage
Generate Completions
To generate a completion for a prompt, use the complete method:

```python
from llama_index.llms.openai import OpenAI

resp = OpenAI().complete("Paul Graham is ")
print(resp)
```
Chat Responses
To send a chat message and receive a response, create a list of ChatMessage instances and use the chat method:

```python
from llama_index.core.llms import ChatMessage
from llama_index.llms.openai import OpenAI

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality."
    ),
    ChatMessage(role="user", content="What is your name?"),
]
resp = OpenAI().chat(messages)
print(resp)
```
Streaming Responses
Stream Complete
To stream responses for a prompt, use the stream_complete method:

```python
from llama_index.llms.openai import OpenAI

llm = OpenAI()
resp = llm.stream_complete("Paul Graham is ")
for r in resp:
    print(r.delta, end="")
```
Stream Chat
To stream chat responses, use the stream_chat method:

```python
from llama_index.llms.openai import OpenAI
from llama_index.core.llms import ChatMessage

llm = OpenAI()
messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality."
    ),
    ChatMessage(role="user", content="What is your name?"),
]
resp = llm.stream_chat(messages)
for r in resp:
    print(r.delta, end="")
```
Configure Model
You can specify a particular model when creating the OpenAI instance:

```python
from llama_index.llms.openai import OpenAI
from llama_index.core.llms import ChatMessage

llm = OpenAI(model="gpt-3.5-turbo")
resp = llm.complete("Paul Graham is ")
print(resp)

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality."
    ),
    ChatMessage(role="user", content="What is your name?"),
]
resp = llm.chat(messages)
print(resp)
```
Asynchronous Usage
You can also use asynchronous methods for completion:

```python
from llama_index.llms.openai import OpenAI

llm = OpenAI(model="gpt-3.5-turbo")
resp = await llm.acomplete("Paul Graham is ")
print(resp)
```
Set API Key at a Per-Instance Level
If desired, you can have separate LLM instances use different API keys:

```python
from llama_index.llms.openai import OpenAI

llm = OpenAI(model="gpt-3.5-turbo", api_key="BAD_KEY")
resp = llm.complete("Paul Graham is ")
print(resp)
```
LLM Implementation example
Download files
File details
Details for the file llama_index_llms_openai-0.7.3.tar.gz.
File metadata
- Download URL: llama_index_llms_openai-0.7.3.tar.gz
- Upload date:
- Size: 27.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.10.12 {"installer":{"name":"uv","version":"0.10.12","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 16c72a9eed24266afe25ebe97a16d9405865dda8e27a4d11b93d038ccc2d164a |
| MD5 | 059d6bc73776624edfd049bba9c7494b |
| BLAKE2b-256 | 23bbbe01df250251feb9ce8d94fbb3ef947ec8f41c98ac57a2766cce38562175 |
File details
Details for the file llama_index_llms_openai-0.7.3-py3-none-any.whl.
File metadata
- Download URL: llama_index_llms_openai-0.7.3-py3-none-any.whl
- Upload date:
- Size: 28.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.10.12 {"installer":{"name":"uv","version":"0.10.12","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"24.04","id":"noble","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":true}

File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | f1ec398a44a6e7c513b86802e5fbd67cd01adf695155ee5034b899e537fe2226 |
| MD5 | ee13eaf3b5e1a08d1a6d9346f895504a |
| BLAKE2b-256 | 991fda6ec7e5f998fac3257def0b99a3f8a69dbcb17dc685dba65944e6fd3962 |