LlamaIndex Embeddings Integration: OPEA Embeddings

OPEA (Open Platform for Enterprise AI) is a platform for building, deploying, and scaling AI applications. As part of this platform, many core generative AI components are available for deployment as microservices, including LLMs and embedding models.

Visit https://opea.dev for more information, and the OPEA GitHub organization for the source code of the OPEA components.

Installation

  1. Install the required Python package:
pip install llama-index-embeddings-opea

Usage

from llama_index.embeddings.opea import OPEAEmbedding

# Point the client at a running OPEA embedding microservice
embed_model = OPEAEmbedding(
    model="<model_name>",
    api_base="http://localhost:8080/v1",
    embed_batch_size=10,
)

# Embed a single string
embedding = embed_model.get_text_embedding("text")

# Embed several strings, sent in batches of embed_batch_size
embeddings = embed_model.get_text_embedding_batch(["text1", "text2"])
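The embedding calls return plain lists of floats. To make downstream use concrete, here is a minimal, dependency-free cosine-similarity sketch; the vectors below are illustrative stand-ins for real embeddings, not output from the service:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two equal-length embedding vectors
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 3-dimensional "embeddings" for illustration
v1 = [1.0, 0.0, 1.0]
v2 = [1.0, 1.0, 0.0]
print(cosine_similarity(v1, v2))  # → 0.5
```

In practice you would pass the lists returned by `get_text_embedding` (or entries of `get_text_embedding_batch`) in place of `v1` and `v2`.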

Download files

Source Distribution

llama_index_embeddings_opea-0.3.1.tar.gz (3.9 kB)

Built Distribution

llama_index_embeddings_opea-0.3.1-py3-none-any.whl (3.5 kB)

File details

Details for the file llama_index_embeddings_opea-0.3.1.tar.gz.

File metadata

  • Download URL: llama_index_embeddings_opea-0.3.1.tar.gz
  • Size: 3.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.10.9 (uv publish on Ubuntu 24.04 "noble", CI)

File hashes

Hashes for llama_index_embeddings_opea-0.3.1.tar.gz

  Algorithm     Hash digest
  SHA256        f7101c485b7bd556d97ed66681e76be7838cca2b8fe079fb616f78cbd014ca66
  MD5           ff1947f00f302fc828914a9799bc5790
  BLAKE2b-256   4fdda00386e6f55e4b282e08dea6adee42f96b8ac8d4278aacc5fd016a16765b
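To check a downloaded distribution against the published digests above, a minimal sketch using Python's standard-library hashlib (the file path is a placeholder for wherever you saved the archive):

```python
import hashlib

def sha256_of_file(path, chunk_size=8192):
    # Stream the file in chunks so large archives are not read into memory at once
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare the result against the SHA256 digest listed for the release, e.g.:
# sha256_of_file("llama_index_embeddings_opea-0.3.1.tar.gz") == "f7101c48..."
```

pip can also enforce this automatically when installing from a requirements file with `--require-hashes`.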

File details

Details for the file llama_index_embeddings_opea-0.3.1-py3-none-any.whl.

File metadata

  • Download URL: llama_index_embeddings_opea-0.3.1-py3-none-any.whl
  • Size: 3.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.10.9 (uv publish on Ubuntu 24.04 "noble", CI)

File hashes

Hashes for llama_index_embeddings_opea-0.3.1-py3-none-any.whl

  Algorithm     Hash digest
  SHA256        aa62644546b71df85ff455f01eb0c070d35e2b9d47ccf218c8c8a63646f6b681
  MD5           db45bbf59e091c38e0d71d217ecfe8c0
  BLAKE2b-256   0d33dddcb06e7926d4b15724e979560e53301c6aa1d630b9f7ae9009e56cfdc6
