
llama-index-llms-meta 0.3.0


LlamaIndex LLMs Integration: Llama API from Meta

Installation

  1. Install the required Python packages

    pip install llama-index-llms-meta
    pip install llama-index
    
  2. Get an API key from the Llama API and export it

    export LLAMA_API_KEY=your_api_key
    
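In Python code, the exported key can then be read back from the environment. A minimal sketch (the `get_llama_api_key` helper name is illustrative, not part of the package) that fails loudly when the key is missing:

```python
import os


def get_llama_api_key() -> str:
    """Return the Llama API key from the environment, failing loudly if unset."""
    key = os.environ.get("LLAMA_API_KEY")
    if not key:
        raise RuntimeError(
            "LLAMA_API_KEY is not set; run `export LLAMA_API_KEY=your_api_key` first."
        )
    return key
```

Failing early with a clear message is easier to debug than an authentication error surfacing later from inside the client.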

Usage

Basic Chat

To send a chat with multiple messages:

import os

from llama_index.core.llms import ChatMessage
from llama_index.llms.meta import LlamaLLM

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="What is your name?"),
]
resp = LlamaLLM(
    model="Llama-3.3-8B-Instruct", api_key=os.environ["LLAMA_API_KEY"]
).chat(messages)
print(resp)

Example output (partial):

assistant: Yer lookin' fer me name, eh? Well, matey, me name be Captain Zephyr "Blackheart" McScurvy!
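The chat call above takes an ordered list of role-tagged messages: a system message that sets the persona, followed by the user turn. As a plain-Python illustration of that shape (`ChatMessage` is the real type; the `make_messages` dict helper below is just for exposition, not part of the package):

```python
def make_messages(system_prompt: str, user_prompt: str) -> list[dict]:
    """Build the role-ordered message list that chat-style LLM APIs expect.

    Plain dicts stand in for LlamaIndex's ChatMessage objects, which carry
    the same role/content fields.
    """
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]
```

Order matters: the system message comes first so it frames every subsequent user turn.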

Streaming Chat

For a streamed conversation, use stream_chat:

import os

from llama_index.core.llms import ChatMessage
from llama_index.llms.meta import LlamaLLM

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="What is your name?"),
]

resp = LlamaLLM(
    model="Llama-3.3-8B-Instruct", api_key=os.environ["LLAMA_API_KEY"]
).stream_chat(messages)

for r in resp:
    print(r.delta, end="")

Example output (partial):

Yer lookin' fer me name, eh? Well, matey, me name be Captain Zephyr "Blackheart" McScurvy!
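Each streamed chunk exposes only the newly generated text in `r.delta`. If you want the full reply as well as live output, accumulate the deltas as they arrive. A minimal sketch (the `collect_stream` helper is illustrative; you would feed it something like `(r.delta for r in resp)`):

```python
from typing import Iterable


def collect_stream(deltas: Iterable[str]) -> str:
    """Print each streamed delta as it arrives and return the assembled reply."""
    parts = []
    for delta in deltas:
        print(delta, end="")  # live output, as in the loop above
        parts.append(delta)
    return "".join(parts)
```

Joining the collected parts once at the end avoids repeated string concatenation inside the loop.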

Download files


Source Distribution

llama_index_llms_meta-0.3.0.tar.gz (4.1 kB)

Built Distribution

llama_index_llms_meta-0.3.0-py3-none-any.whl (3.6 kB)

