
LlamaIndex Vector_Stores Integration: Moorcheh

Welcome to the Moorcheh vector store integration for LlamaIndex.

This module introduces support for Moorcheh, a semantic vector database developed by EdgeAI Innovations. Moorcheh enables fast and intelligent document retrieval using hybrid scoring and generative answering capabilities. The integration is implemented in accordance with the standard vector store interface defined by LlamaIndex and supports all core methods including add, query, delete, and generate_answer.

To see the integration in action, refer to the demonstration notebook: Google Colab Demo.

Getting started

To begin using the Moorcheh vector store, make sure to install the necessary packages:

pip install llama-index
pip install llama-index-vector-stores-moorcheh
pip install moorcheh-sdk

Example Usage

Here is a simple example demonstrating how to use the Moorcheh integration with LlamaIndex:

import os

from IPython.display import Markdown, display
from llama_index.core import SimpleDirectoryReader, StorageContext, VectorStoreIndex
from llama_index.vector_stores.moorcheh import MoorchehVectorStore

api_key = os.environ["MOORCHEH_API_KEY"]

documents = SimpleDirectoryReader("./your-directory").load_data()

# Create a Moorcheh vector store with the following parameters.
# For text-based namespaces, set namespace_type to "text" and vector_dimension to None.
# For vector-based namespaces, set namespace_type to "vector" and vector_dimension
# to the dimension of your uploaded vectors.
vector_store = MoorchehVectorStore(
    api_key=api_key,
    namespace="llamaindex_moorcheh",
    namespace_type="text",
    vector_dimension=None,
    add_sparse_vector=False,
    batch_size=100,
)

storage_context = StorageContext.from_defaults(vector_store=vector_store)
index = VectorStoreIndex.from_documents(
    documents, storage_context=storage_context
)

query_engine = index.as_query_engine()
response = query_engine.query("Which company has had the highest revenue in 2025 and why?")

display(Markdown(f"<b>{response}</b>"))
print(response)
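The `batch_size=100` argument above controls how many items the store sends per upload request. The batching behavior it implies can be sketched in plain Python; the `chunked` helper below is hypothetical, for illustration only, and is not part of the Moorcheh SDK:

```python
from typing import Iterator, List, Sequence, TypeVar

T = TypeVar("T")


def chunked(items: Sequence[T], batch_size: int = 100) -> Iterator[List[T]]:
    """Yield successive batches of at most batch_size items."""
    for start in range(0, len(items), batch_size):
        yield list(items[start:start + batch_size])


# 250 documents uploaded with batch_size=100 go out in three requests
docs = [f"doc-{i}" for i in range(250)]
batch_sizes = [len(batch) for batch in chunked(docs, batch_size=100)]
print(batch_sizes)  # [100, 100, 50]
```

Larger batches mean fewer round trips but bigger request payloads; the default of 100 is a middle ground.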

Download files

Download the file for your platform.

Source Distribution

llama_index_vector_stores_moorcheh-0.2.1.tar.gz (7.8 kB)


Built Distribution

llama_index_vector_stores_moorcheh-0.2.1-py3-none-any.whl

File details

Details for the file llama_index_vector_stores_moorcheh-0.2.1.tar.gz.


File hashes

Hashes for llama_index_vector_stores_moorcheh-0.2.1.tar.gz
Algorithm Hash digest
SHA256 8e36a33711c3a6f1bf2d5006d96bb8807109726dacd7af756715db215124bfa1
MD5 cc8acdf3f76bda273728c7d902c8cba2
BLAKE2b-256 bb79e59aeaeec5848ae71e5e2dc7810b06f616618c4d74a9ec580cf97147b46a


File details

Details for the file llama_index_vector_stores_moorcheh-0.2.1-py3-none-any.whl.


File hashes

Hashes for llama_index_vector_stores_moorcheh-0.2.1-py3-none-any.whl
Algorithm Hash digest
SHA256 fd6c5d98abbda40774f7ef3a565361943d3d137ee6f3743b200627322d1601aa
MD5 c792475d3b823482fa9af1d9f5a053ac
BLAKE2b-256 cfd0e491e3e270539c0fb504cddcd731a9edd73eed3565afacb0fd30e1e5b2d6

