Project description
LlamaIndex Llms Integration: Perplexity
Installation
To install the required packages, run:
%pip install llama-index-llms-perplexity
!pip install llama-index
Setup
Import Libraries and Configure API Key
Import the necessary libraries and set your Perplexity API key:
from llama_index.llms.perplexity import Perplexity
pplx_api_key = "your-perplexity-api-key" # Replace with your actual API key
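In practice you may prefer not to hard-code the key; reading it from an environment variable is a common alternative. A minimal sketch, assuming you have exported a variable named PPLX_API_KEY (the variable name here is illustrative):
import os

# Read the key from the environment, falling back to a placeholder if unset
pplx_api_key = os.environ.get("PPLX_API_KEY", "your-perplexity-api-key")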
Initialize the Perplexity LLM
Create an instance of the Perplexity LLM with your API key and desired model settings:
llm = Perplexity(
    api_key=pplx_api_key, model="mistral-7b-instruct", temperature=0.5
)
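LlamaIndex LLMs also expose a single-prompt `complete` method alongside `chat`. A minimal sketch, assuming the installed version of this integration supports completion (the prompt text is illustrative):
# One-shot completion without constructing chat messages
response = llm.complete("Briefly explain what Perplexity AI is.")
print(response)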
Chat Example
Sending a Chat Message
You can send a chat message using the `chat` method. Here's how to do that:
from llama_index.core.llms import ChatMessage
messages_dict = [
    {"role": "system", "content": "Be precise and concise."},
    {"role": "user", "content": "Tell me 5 sentences about Perplexity."},
]
messages = [ChatMessage(**msg) for msg in messages_dict]
# Get response from the model
response = llm.chat(messages)
print(response)
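If you only need the text of the assistant's reply rather than the whole response object, you can read it from the message field. A small sketch, assuming the standard llama_index.core ChatResponse structure:
# Extract just the assistant's text from the ChatResponse
assistant_text = response.message.content
print(assistant_text)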
Async Chat
To send messages asynchronously, you can use the `achat` method:
response = await llm.achat(messages)
print(response)
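Top-level await like the snippet above works in a notebook. In a plain Python script, wrap the call in an async function and run it with asyncio; a minimal sketch:
import asyncio

async def main():
    # Reuse the llm and messages defined earlier
    response = await llm.achat(messages)
    print(response)

asyncio.run(main())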
Stream Chat
For streaming responses, you can use the `stream_chat` method:
resp = llm.stream_chat(messages)
for r in resp:
    print(r.delta, end="")
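Each streamed chunk exposes the newly generated text on its delta attribute, so you can print incrementally and assemble the full reply at the same time. A small sketch:
resp = llm.stream_chat(messages)
chunks = []
for r in resp:
    chunks.append(r.delta or "")  # delta holds only the new text for this chunk
    print(r.delta, end="")
full_text = "".join(chunks)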
Async Stream Chat
To stream responses asynchronously, use the `astream_chat` method:
resp = await llm.astream_chat(messages)
async for delta in resp:
    print(delta.delta, end="")
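As with `achat`, running this outside a notebook means wrapping it in an async function; a minimal sketch mirroring the pattern shown above:
import asyncio

async def stream_reply():
    # astream_chat yields chunks that can be consumed with async for
    resp = await llm.astream_chat(messages)
    async for delta in resp:
        print(delta.delta, end="")

asyncio.run(stream_reply())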
LLM Implementation Example
For a complete walkthrough, see the example notebook in the LlamaIndex documentation: https://docs.llamaindex.ai/en/stable/examples/llm/perplexity/
File details
Details for the file llama_index_llms_perplexity-0.3.0.tar.gz.
File metadata
- Download URL: llama_index_llms_perplexity-0.3.0.tar.gz
- Upload date:
- Size: 4.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.3 CPython/3.11.10 Darwin/22.3.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | 38dee0d1cb67b8e9e0accd00d1b3f42c875197fbe482c2523fc2babb5a8216da
MD5 | 2e8092d53bc732ed30b94a4143b0813d
BLAKE2b-256 | a9c98f031ee876128935623387fed6e32e25ad8952444645a7fc5ea2673d518f
File details
Details for the file llama_index_llms_perplexity-0.3.0-py3-none-any.whl.
File metadata
- Download URL: llama_index_llms_perplexity-0.3.0-py3-none-any.whl
- Upload date:
- Size: 5.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.3 CPython/3.11.10 Darwin/22.3.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | e14a5b00c79d9b9d3d1a3f14da988a07a0c8befc8aaa00b219630fbf997f1697
MD5 | 00ed4aea9e8291be6c8b3caa151ef382
BLAKE2b-256 | 5412ef48de642a410c82350be6f791d0a7c133a63c9e7d34053811ca7760aab9