LlamaIndex Llms Integration: Perplexity
Installation
To install the required packages, run the following (the % and ! prefixes are Jupyter notebook magics; in a regular shell, use pip install directly):
%pip install llama-index-llms-perplexity
!pip install llama-index
Setup
Import Libraries and Configure API Key
Import the necessary libraries and set your Perplexity API key:
from llama_index.llms.perplexity import Perplexity
pplx_api_key = "your-perplexity-api-key" # Replace with your actual API key
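If you prefer not to hard-code the key, a minimal sketch is to read it from the environment at runtime (the variable name PERPLEXITY_API_KEY is an arbitrary choice for this example, not one required by the library):
import os

# Hypothetical environment variable name; adjust to whatever you export
pplx_api_key = os.environ.get("PERPLEXITY_API_KEY", "your-perplexity-api-key")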
Initialize the Perplexity LLM
Create an instance of the Perplexity LLM with your API key and desired model settings:
llm = Perplexity(
    api_key=pplx_api_key, model="mistral-7b-instruct", temperature=0.5
)
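Because the Perplexity class follows the standard LlamaIndex LLM interface, you can also issue a plain completion call. A minimal sketch (the prompt text is only illustrative):
# Single-turn completion; the response object exposes the generated text
response = llm.complete("Briefly explain what Perplexity AI does.")
print(response.text)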
Chat Example
Sending a Chat Message
You can send a chat message using the chat method. Here's how to do that:
from llama_index.core.llms import ChatMessage
messages_dict = [
    {"role": "system", "content": "Be precise and concise."},
    {"role": "user", "content": "Tell me 5 sentences about Perplexity."},
]
messages = [ChatMessage(**msg) for msg in messages_dict]
# Get response from the model
response = llm.chat(messages)
print(response)
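The returned ChatResponse wraps the assistant's ChatMessage; if you only want the generated text rather than the role-prefixed string that print shows, a small sketch:
# Access the assistant message text directly
print(response.message.content)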
Async Chat
To send messages asynchronously, you can use the achat method:
response = await llm.achat(messages)
print(response)
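A bare await like this only works inside an already-running event loop, such as a Jupyter notebook. In a regular script you would wrap the call in a coroutine and run it yourself; a minimal sketch reusing the llm and messages defined above:
import asyncio

async def main():
    # Await the chat call inside an async function
    response = await llm.achat(messages)
    print(response)

asyncio.run(main())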
Stream Chat
For streaming responses, you can use the stream_chat method:
resp = llm.stream_chat(messages)
for r in resp:
    print(r.delta, end="")
Async Stream Chat
To stream responses asynchronously, use the astream_chat method:
resp = await llm.astream_chat(messages)
async for delta in resp:
    print(delta.delta, end="")
LLM Implementation example
For a full, runnable notebook, see the Perplexity LLM example in the LlamaIndex documentation: https://docs.llamaindex.ai/en/stable/examples/llm/perplexity/
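Once configured, the instance can be plugged into the rest of LlamaIndex like any other LLM, for example by setting it as the global default. A minimal sketch (assuming llama-index-core is installed, as in the setup above):
from llama_index.core import Settings

# Use the Perplexity LLM as the default model for LlamaIndex components
Settings.llm = llm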
File details
Details for the file llama_index_llms_perplexity-0.3.1.tar.gz.
File metadata
- Download URL: llama_index_llms_perplexity-0.3.1.tar.gz
- Upload date:
- Size: 4.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.3 CPython/3.10.12 Linux/6.5.0-1025-azure
File hashes
Algorithm | Hash digest
---|---
SHA256 | 12d85b8533544a43734df79749582d2857b607bb2ff617f4432ecfd3759a6938
MD5 | 5ac551e14c20c7222d714a22225fd8e2
BLAKE2b-256 | 0587d77d7fdc0123e8f9c7efa6e273c91121c63baf9a50ec996ab274e8c3009c
File details
Details for the file llama_index_llms_perplexity-0.3.1-py3-none-any.whl.
File metadata
- Download URL: llama_index_llms_perplexity-0.3.1-py3-none-any.whl
- Upload date:
- Size: 4.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.3 CPython/3.10.12 Linux/6.5.0-1025-azure
File hashes
Algorithm | Hash digest
---|---
SHA256 | e1d442802845358bcc76cfec6270763ccbcbd3b99a37618fed1e3044a20cbe7a
MD5 | 41f49416932072320cf0a979b57083c1
BLAKE2b-256 | dbd9781b5b13a8925bd63304bd3f362cfe22d05436a65c41c90e24b9b7d1430a