
LlamaIndex LLMs Integration: Llama API

Prerequisites

  1. API Key: Obtain an API key from Llama API.
  2. Python 3.x: Ensure you have Python installed on your system.

Installation

  1. Install the required Python packages:

    %pip install llama-index-program-openai
    %pip install llama-index-llms-llama-api
    %pip install llama-index
    

Basic Usage

Import Required Libraries

from llama_index.llms.llama_api import LlamaAPI
from llama_index.core.llms import ChatMessage

Initialize LlamaAPI

Set up the API key:

api_key = "LL-your-key"
llm = LlamaAPI(api_key=api_key)
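
In practice, the key is usually read from the environment rather than hard-coded. A minimal sketch, assuming an environment variable named LLAMA_API_KEY (the name is a placeholder, not something the package requires):

import os

# Placeholder variable name; use whatever your deployment provides.
api_key = os.getenv("LLAMA_API_KEY", "LL-your-key")
llm = LlamaAPI(api_key=api_key)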

Complete with a Prompt

Generate a response using a prompt:

resp = llm.complete("Paul Graham is ")
print(resp)

Chat with a List of Messages

Interact with the model using a chat interface:

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="What is your name"),
]
resp = llm.chat(messages)
print(resp)
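
The chat call returns a response object wrapping the assistant's ChatMessage; if you only need the reply text, it can be read from the message, as in this small sketch:

# The assistant's reply is carried on the response's message.
print(resp.message.content)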

Function Calling

Define a function using Pydantic and call it through LlamaAPI:

from pydantic import BaseModel
from llama_index.core.llms.openai_utils import to_openai_function


class Song(BaseModel):
    """A song with name and artist"""

    name: str
    artist: str


song_fn = to_openai_function(Song)
response = llm.complete("Generate a song", functions=[song_fn])
function_call = response.additional_kwargs["function_call"]
print(function_call)
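
Assuming the returned function_call follows the OpenAI-style schema, where "arguments" is a JSON string, it can be validated back into the Song model. This is only a sketch, not part of the package's API:

import json

# Sketch: parse the JSON arguments string back into the Pydantic model.
song = Song(**json.loads(function_call["arguments"]))
print(song.name, song.artist)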

Structured Data Extraction

Define schemas for structured output using Pydantic:

from pydantic import BaseModel
from typing import List


class Song(BaseModel):
    """Data model for a song."""

    title: str
    length_mins: int


class Album(BaseModel):
    """Data model for an album."""

    name: str
    artist: str
    songs: List[Song]

Define the prompt template for extracting structured data:

from llama_index.program.openai import OpenAIPydanticProgram

prompt_template_str = """\
Extract album and songs from the text provided.
For each song, make sure to specify the title and the length_mins.
{text}
"""

llm = LlamaAPI(api_key=api_key, temperature=0.0)

program = OpenAIPydanticProgram.from_defaults(
    output_cls=Album,
    llm=llm,
    prompt_template_str=prompt_template_str,
    verbose=True,
)

Run Program to Get Structured Output

Execute the program to extract structured data from the provided text:

output = program(
    text="""
    "Echoes of Eternity" is a compelling and thought-provoking album, skillfully crafted by the renowned artist, Seraphina Rivers. \
    This captivating musical collection takes listeners on an introspective journey, delving into the depths of the human experience \
    and the vastness of the universe. With her mesmerizing vocals and poignant songwriting, Seraphina Rivers infuses each track with \
    raw emotion and a sense of cosmic wonder. The album features several standout songs, including the hauntingly beautiful "Stardust \
    Serenade," a celestial ballad that lasts for six minutes, carrying listeners through a celestial dreamscape. "Eclipse of the Soul" \
    captivates with its enchanting melodies and spans over eight minutes, inviting introspection and contemplation. Another gem, "Infinity \
    Embrace," unfolds like a cosmic odyssey, lasting nearly ten minutes, drawing listeners deeper into its ethereal atmosphere. "Echoes of Eternity" \
    is a masterful testament to Seraphina Rivers' artistic prowess, leaving an enduring impact on all who embark on this musical voyage through \
    time and space.
    """
)

Output Example

You can print the structured output like this:

print(output)
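
Because the program returns an Album instance, the extracted fields are regular attributes and can be used directly; for example:

# output is an Album, so its fields can be accessed like any Pydantic model.
print(output.name, "by", output.artist)
for song in output.songs:
    print(f"- {song.title}: {song.length_mins} mins")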

LLM Implementation Example

https://docs.llamaindex.ai/en/stable/examples/llm/llama_api/
