
LlamaIndex Vector_Stores Integration: Moorcheh

Welcome to the Moorcheh vector store integration for LlamaIndex.

This module adds support for Moorcheh, a semantic vector database developed by EdgeAI Innovations. Moorcheh enables fast, intelligent document retrieval using hybrid scoring and generative answering. The integration implements the standard vector store interface defined by LlamaIndex and supports the core methods add, query, delete, and generate_answer.
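The method names above follow the LlamaIndex vector store convention; the shape of that interface can be pictured as a small protocol. The parameter names below are illustrative assumptions, not the actual SDK signatures:

```python
from typing import Any, List, Protocol, runtime_checkable


@runtime_checkable
class VectorStoreLike(Protocol):
    """Illustrative sketch of the core vector store interface.

    Only the method names (add, query, delete, generate_answer) come from
    the description above; the parameter names are assumptions.
    """

    def add(self, nodes: List[Any]) -> List[str]: ...

    def query(self, query: Any) -> Any: ...

    def delete(self, ref_doc_id: str) -> None: ...

    def generate_answer(self, query: str) -> str: ...
```

Any store implementing these four methods satisfies the protocol structurally, which is how LlamaIndex can swap vector store backends behind a uniform interface.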

To see the integration in action, refer to the demonstration notebook: Google Colab Demo.

Getting started

To begin using the Moorcheh vector store, make sure to install the necessary packages:

pip install llama-index
pip install llama-index-vector-stores-moorcheh
pip install moorcheh_sdk

Example Usage

Here is a simple example demonstrating how to use the Moorcheh integration with LlamaIndex:

import os

from IPython.display import Markdown, display
from llama_index.core import SimpleDirectoryReader, StorageContext, VectorStoreIndex
from llama_index.vector_stores.moorcheh import MoorchehVectorStore

api_key = os.environ["MOORCHEH_API_KEY"]

documents = SimpleDirectoryReader("./your-directory").load_data()

# Create a Moorcheh vector store.
# For text-based namespaces, set namespace_type to "text" and vector_dimension to None.
# For vector-based namespaces, set namespace_type to "vector" and vector_dimension
# to the dimension of your uploaded vectors.
vector_store = MoorchehVectorStore(
    api_key=api_key,
    namespace="llamaindex_moorcheh",
    namespace_type="text",
    vector_dimension=None,
    add_sparse_vector=False,
    batch_size=100,
)

storage_context = StorageContext.from_defaults(vector_store=vector_store)
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)

query_engine = index.as_query_engine()
response = query_engine.query("Which company has had the highest revenue in 2025 and why?")

display(Markdown(f"<b>{response}</b>"))
print(response)
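The batch_size argument above bounds how many documents are sent per upload request. Conceptually, the store splits the document list into fixed-size chunks; a minimal sketch of that chunking (a hypothetical helper, not the SDK's actual implementation):

```python
from typing import Iterator, List, TypeVar

T = TypeVar("T")


def batched(items: List[T], batch_size: int = 100) -> Iterator[List[T]]:
    """Yield successive chunks of at most batch_size items, mirroring how a
    vector store might group documents per upload request (illustrative only)."""
    for start in range(0, len(items), batch_size):
        yield items[start : start + batch_size]
```

With 250 documents and batch_size=100, this yields chunks of 100, 100, and 50, so uploads stay within a predictable request size.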



