llama-index-llms-friendli
LlamaIndex Llms Integration: Friendli
Installation
- Install the required Python packages:
%pip install llama-index-llms-friendli
%pip install llama-index
- Set the Friendli token as an environment variable:
%env FRIENDLI_TOKEN=your_token_here
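The %env magic works inside a notebook. If you are running a plain Python script instead, a minimal alternative is to set the variable from Python before creating the client (the token value below is a placeholder):
import os

# Placeholder value; substitute your actual Friendli token.
os.environ["FRIENDLI_TOKEN"] = "your_token_here"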
Usage
Basic Chat
To generate a chat response, use the following code:
from llama_index.llms.friendli import Friendli
from llama_index.core.llms import ChatMessage, MessageRole
llm = Friendli()
message = ChatMessage(role=MessageRole.USER, content="Tell me a joke.")
resp = llm.chat([message])
print(resp)
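Printing resp shows the full chat response; if you only need the reply text, it is typically available via the message attribute of the standard llama-index ChatResponse (a minimal sketch, assuming that response type):
# Access just the assistant's reply text (assumes the standard
# llama-index ChatResponse, which exposes a ChatMessage via .message).
print(resp.message.content)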
Streaming Responses
To stream chat responses in real time:
resp = llm.stream_chat([message])
for r in resp:
print(r.delta, end="")
Asynchronous Chat
For asynchronous chat interactions, use the following:
resp = await llm.achat([message])
print(resp)
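The await call above assumes an already-running event loop (as in a notebook). In a standalone script, one way to drive the same coroutine is with asyncio.run; the sketch below reuses only the calls shown in this guide:
import asyncio

from llama_index.llms.friendli import Friendli
from llama_index.core.llms import ChatMessage, MessageRole

async def main():
    # Build the client and a user message, then await the async chat call.
    llm = Friendli()
    message = ChatMessage(role=MessageRole.USER, content="Tell me a joke.")
    resp = await llm.achat([message])
    print(resp)

# Create an event loop and run the coroutine to completion.
asyncio.run(main())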
Async Streaming
To handle async streaming of chat responses:
resp = await llm.astream_chat([message])
async for r in resp:
print(r.delta, end="")
Complete with a Prompt
To generate a completion based on a prompt:
prompt = "Draft a cover letter for a role in software engineering."
resp = llm.complete(prompt)
print(resp)
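Printing resp shows the generated text; with the standard llama-index CompletionResponse the raw string is also available as an attribute (a minimal sketch, assuming that response type):
# The generated completion as a plain string (assumes the standard
# llama-index CompletionResponse, which exposes it via .text).
print(resp.text)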
Streaming Completion
To stream completions in real time:
resp = llm.stream_complete(prompt)
for r in resp:
print(r.delta, end="")
Async Completion
To handle async completions:
resp = await llm.acomplete(prompt)
print(resp)
Async Streaming Completion
For async streaming of completions:
resp = await llm.astream_complete(prompt)
async for r in resp:
print(r.delta, end="")
Model Configuration
To configure a specific model:
llm = Friendli(model="llama-2-70b-chat")
resp = llm.chat([message])
print(resp)
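The configured model works with every method shown above; for example, streaming a chat response from the explicitly selected model reuses the same calls documented earlier:
# Stream tokens from the explicitly configured model.
for r in llm.stream_chat([message]):
    print(r.delta, end="")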
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file llama_index_llms_friendli-0.2.2.tar.gz.
File metadata
- Download URL: llama_index_llms_friendli-0.2.2.tar.gz
- Upload date:
- Size: 4.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.3 CPython/3.10.12 Linux/6.8.0-1014-azure
File hashes
Algorithm | Hash digest
---|---
SHA256 | dc4617ef80d1f5f2d2e0ac12f4aed9f6249f39033c0e9c12b952d51c5fd5ca07
MD5 | 5ac86a8f79580d17d6465d809273f519
BLAKE2b-256 | 1f4774db6120097630756df6a436cacec1b097e378f4de88f967e1c7c4a07097
File details
Details for the file llama_index_llms_friendli-0.2.2-py3-none-any.whl.
File metadata
- Download URL: llama_index_llms_friendli-0.2.2-py3-none-any.whl
- Upload date:
- Size: 4.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.3 CPython/3.10.12 Linux/6.8.0-1014-azure
File hashes
Algorithm | Hash digest
---|---
SHA256 | 4a553c2e5940c167cbd70caed9f50717a4e0ed47605b9045f90b32ae7cc083f8
MD5 | 291e792f448d888444ab3e8ee52c603f
BLAKE2b-256 | 4268fa00002be78de201eb6366a58b17c1d02bc07ced20df3d1eebb85f08589e