llama-index llms opea integration

Project description

LlamaIndex Llms Integration: OPEA LLM

OPEA (Open Platform for Enterprise AI) is a platform for building, deploying, and scaling AI applications. As part of this platform, many core generative AI components are available for deployment as microservices, including LLMs.

Visit https://opea.dev for more information, and their GitHub for the source code of the OPEA components.

Installation

  1. Install the required Python package:
%pip install llama-index-llms-opea

Usage

from llama_index.core.llms import ChatMessage
from llama_index.llms.opea import OPEA

llm = OPEA(
    model="meta-llama/Meta-Llama-3.1-8B-Instruct",
    api_base="http://localhost:8080/v1",
    temperature=0.7,
    max_tokens=256,
    additional_kwargs={"top_p": 0.95},
)

# Complete a prompt
response = llm.complete("What is the capital of France?")
print(response)

# Stream a chat response
response = llm.stream_chat(
    [ChatMessage(role="user", content="What is the capital of France?")]
)
for chunk in response:
    print(chunk.delta, end="", flush=True)

The available methods are:

  • complete()
  • stream_complete()
  • chat()
  • stream_chat()

as well as async versions of each method, prefixed with a (acomplete(), astream_complete(), achat(), astream_chat()).

Project details


Download files

Download the file for your platform.

Source Distribution

llama_index_llms_opea-0.3.0.tar.gz (3.8 kB view details)

Uploaded Source

Built Distribution

llama_index_llms_opea-0.3.0-py3-none-any.whl (3.3 kB view details)

Uploaded Python 3

File details

Details for the file llama_index_llms_opea-0.3.0.tar.gz.

File metadata

  • Download URL: llama_index_llms_opea-0.3.0.tar.gz
  • Size: 3.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.10.9 (Ubuntu 24.04, CI)

File hashes

Hashes for llama_index_llms_opea-0.3.0.tar.gz:

  • SHA256: 8b9828d988229d0b85c231b0b81dc6fe25607205e69c1e4e280b7b5cc9dd22a7
  • MD5: 4b612d2380c78ef9eea6d75a10d5f285
  • BLAKE2b-256: f81ab6028a072b1052fcc2359214098ba6f3ddb136cde35da203483f8ccaabd4

File details

Details for the file llama_index_llms_opea-0.3.0-py3-none-any.whl.

File metadata

  • Download URL: llama_index_llms_opea-0.3.0-py3-none-any.whl
  • Size: 3.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.10.9 (Ubuntu 24.04, CI)

File hashes

Hashes for llama_index_llms_opea-0.3.0-py3-none-any.whl:

  • SHA256: 86f5a1ab25d2229e559b4705559cbb24d445516de2afcd4b217ca9227059acca
  • MD5: e2c2a07207a083eb006937607fa0830d
  • BLAKE2b-256: f1fd886c612bfe9b40d5aa196d60f3e85111663b030aaebf50d944a8ab2ff726

