
LlamaIndex Llms Integration: OPEA LLM

OPEA (Open Platform for Enterprise AI) is a platform for building, deploying, and scaling AI applications. As part of this platform, many core generative AI components, including LLMs, are available for deployment as microservices.

Visit https://opea.dev for more information, and the OPEA GitHub organization for the source code of the OPEA components.

Installation

Install the required Python package:

%pip install llama-index-llms-opea

Usage

from llama_index.core.llms import ChatMessage
from llama_index.llms.opea import OPEA

llm = OPEA(
    model="meta-llama/Meta-Llama-3.1-8B-Instruct",
    api_base="http://localhost:8080/v1",
    temperature=0.7,
    max_tokens=256,
    additional_kwargs={"top_p": 0.95},
)

# Complete a prompt
response = llm.complete("What is the capital of France?")
print(response)

# Stream a chat response
response = llm.stream_chat(
    [ChatMessage(role="user", content="What is the capital of France?")]
)
for chunk in response:
    print(chunk.delta, end="", flush=True)

The available methods are:

  • complete()
  • stream_complete()
  • chat()
  • stream_chat()

Async variants of each method are available with an a prefix (e.g. acomplete(), astream_chat()).
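For example, the async variants can be called from an event loop. This is a sketch assuming the same local OPEA LLM microservice as above is reachable at http://localhost:8080/v1:

```python
import asyncio

from llama_index.core.llms import ChatMessage
from llama_index.llms.opea import OPEA

# Assumes an OPEA LLM microservice is running locally; adjust
# model and api_base to match your deployment.
llm = OPEA(
    model="meta-llama/Meta-Llama-3.1-8B-Instruct",
    api_base="http://localhost:8080/v1",
)


async def main() -> None:
    # Async completion
    response = await llm.acomplete("What is the capital of France?")
    print(response)

    # Async streaming chat
    stream = await llm.astream_chat(
        [ChatMessage(role="user", content="What is the capital of France?")]
    )
    async for chunk in stream:
        print(chunk.delta, end="", flush=True)


asyncio.run(main())
```

The synchronous and async methods share the same arguments, so an existing complete() or stream_chat() call can be swapped for its async counterpart without other changes.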

