
LlamaIndex Llms Integration: Friendli

Installation

  1. Install the required Python packages:

    %pip install llama-index-llms-friendli
    %pip install llama-index
    
  2. Set the Friendli token as an environment variable:

    %env FRIENDLI_TOKEN=your_token_here
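
Outside a notebook, the `%env` magic is unavailable; a minimal alternative is to set the variable from Python before constructing the client (the token value here is a placeholder, not a real token):

```python
import os

# Equivalent of `%env FRIENDLI_TOKEN=your_token_here` for plain Python
# scripts; replace the placeholder with your actual Friendli token.
os.environ["FRIENDLI_TOKEN"] = "your_token_here"
```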
    

Usage

Basic Chat

To generate a chat response, use the following code:

from llama_index.llms.friendli import Friendli
from llama_index.core.llms import ChatMessage, MessageRole

llm = Friendli()

message = ChatMessage(role=MessageRole.USER, content="Tell me a joke.")
resp = llm.chat([message])
print(resp)

Streaming Responses

To stream chat responses in real time:

resp = llm.stream_chat([message])
for r in resp:
    print(r.delta, end="")
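
Instead of printing each delta as it arrives, you can accumulate the chunks into the full response text. A sketch of that pattern, using stand-in objects (the `Chunk` class here is a placeholder for the response chunks yielded by `stream_chat`, which expose a `.delta` attribute; in practice `chunks` would be `llm.stream_chat([message])`):

```python
from dataclasses import dataclass


@dataclass
class Chunk:
    """Stand-in for a streamed chat chunk, which exposes a .delta attribute."""
    delta: str


# Stand-in data so the accumulation pattern can be shown on its own;
# with a live client this would be: chunks = llm.stream_chat([message])
chunks = [Chunk("Tell me "), Chunk("a joke "), Chunk("response.")]
full_text = "".join(chunk.delta or "" for chunk in chunks)
print(full_text)  # the complete response text
```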

Asynchronous Chat

For asynchronous chat interactions, use the following:

resp = await llm.achat([message])
print(resp)
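
The top-level `await` above assumes a running event loop, as in a Jupyter notebook. In a plain script, one way to run the same call is to wrap it in a coroutine and hand it to `asyncio.run`; this is a sketch with the LLM call commented out, since executing it requires a valid `FRIENDLI_TOKEN`:

```python
import asyncio


async def main():
    # In a real script this would be:
    #   resp = await llm.achat([message])
    #   print(resp)
    await asyncio.sleep(0)  # placeholder for the awaited LLM call
    return "done"


result = asyncio.run(main())
```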

Async Streaming

To handle async streaming of chat responses:

resp = await llm.astream_chat([message])
async for r in resp:
    print(r.delta, end="")

Complete with a Prompt

To generate a completion based on a prompt:

prompt = "Draft a cover letter for a role in software engineering."
resp = llm.complete(prompt)
print(resp)

Streaming Completion

To stream completions in real time:

resp = llm.stream_complete(prompt)
for r in resp:
    print(r.delta, end="")

Async Completion

To handle async completions:

resp = await llm.acomplete(prompt)
print(resp)

Async Streaming Completion

For async streaming of completions:

resp = await llm.astream_complete(prompt)
async for r in resp:
    print(r.delta, end="")

Model Configuration

To configure a specific model:

llm = Friendli(model="llama-2-70b-chat")
resp = llm.chat([message])
print(resp)

LLM Implementation Example

https://docs.llamaindex.ai/en/stable/examples/llm/friendli/
