
llama-index-embeddings-opea

Project description

LlamaIndex Embeddings Integration: OPEA Embeddings

OPEA (Open Platform for Enterprise AI) is a platform for building, deploying, and scaling AI applications. As part of this platform, many core generative AI components are available for deployment as microservices, including LLMs.

Visit https://opea.dev for more information, and the OPEA GitHub organization for the source code of the OPEA components.

Installation

  1. Install the required Python package:
%pip install llama-index-embeddings-opea

Usage

from llama_index.embeddings.opea import OPEAEmbedding

embed_model = OPEAEmbedding(
    model="<model_name>",  # name of the model served by the OPEA microservice
    api_base="http://localhost:8080/v1",  # URL of your OPEA embedding endpoint
    embed_batch_size=10,  # number of texts embedded per request
)

# Embed a single text.
embedding = embed_model.get_text_embedding("text")

# Embed several texts in batches of embed_batch_size.
embeddings = embed_model.get_text_embedding_batch(["text1", "text2"])
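Each call above returns embedding vectors (lists of floats), which are typically compared with cosine similarity for retrieval or deduplication. A minimal plain-Python sketch of that comparison follows; the vectors here are made-up stand-ins, not actual model output:

```python
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


# Toy vectors standing in for real embedding output.
v1 = [0.1, 0.9, 0.4]
v2 = [0.1, 0.8, 0.5]
print(cosine_similarity(v1, v2))
```

In practice you would pass the vectors returned by `get_text_embedding` (or let a LlamaIndex vector index handle the comparison for you).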

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

llama_index_embeddings_opea-0.3.0.tar.gz (3.9 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

llama_index_embeddings_opea-0.3.0-py3-none-any.whl (3.5 kB)

Uploaded Python 3

File details

Details for the file llama_index_embeddings_opea-0.3.0.tar.gz.

File metadata

  • Download URL: llama_index_embeddings_opea-0.3.0.tar.gz
  • Upload date:
  • Size: 3.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.10.9 (Ubuntu 24.04, CI)

File hashes

Hashes for llama_index_embeddings_opea-0.3.0.tar.gz
Algorithm Hash digest
SHA256 ed1ba0791cca6d244ef48d4d694f5fb09c506068cad18cf03f904c6d649e2ff6
MD5 ace38139716bd998de225f89c85f2c37
BLAKE2b-256 14c25ddc7b376b30a28db286951ec55ceec401456cf0296496a5ce116909ae1e

See more details on using hashes here.
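To check a downloaded archive against the published digests above, you can compute its SHA256 locally. A minimal sketch using Python's standard-library `hashlib` (the filename and expected digest come from the table above):

```python
import hashlib


def sha256_of_file(path: str) -> str:
    """Return the hex SHA256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


# After downloading the sdist, compare against the published digest:
# expected = "ed1ba0791cca6d244ef48d4d694f5fb09c506068cad18cf03f904c6d649e2ff6"
# assert sha256_of_file("llama_index_embeddings_opea-0.3.0.tar.gz") == expected
```

Alternatively, pip's hash-checking mode (`--require-hashes` with a requirements file) performs this verification automatically at install time.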

File details

Details for the file llama_index_embeddings_opea-0.3.0-py3-none-any.whl.

File metadata

  • Download URL: llama_index_embeddings_opea-0.3.0-py3-none-any.whl
  • Upload date:
  • Size: 3.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.10.9 (Ubuntu 24.04, CI)

File hashes

Hashes for llama_index_embeddings_opea-0.3.0-py3-none-any.whl
Algorithm Hash digest
SHA256 cf181a2fb75a378ed245cbbe35b5c4badfde200eb7007136dca3126af4c07fdc
MD5 d9bdf81c22c9695d420b115a1b29386a
BLAKE2b-256 0a7bc9268959190b4dc05f14296658a390b90dc1134572220ef648f18e70b1c1

See more details on using hashes here.
