llama-index vector_stores moorcheh integration

Project description

LlamaIndex Vector_Stores Integration: Moorcheh

Welcome to the Moorcheh vector store integration for LlamaIndex.

This module adds support for Moorcheh, a semantic vector database developed by EdgeAI Innovations. Moorcheh enables fast, intelligent document retrieval with hybrid scoring and generative answering. The integration implements the standard vector store interface defined by LlamaIndex and supports all core methods: add, query, delete, and generate_answer.
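To make the interface concrete, here is a purely illustrative, self-contained stand-in for the four core methods. This is NOT the real MoorchehVectorStore (whose methods delegate to the Moorcheh service and have different signatures); it only sketches the shape of add, query, delete, and generate_answer:

```python
from dataclasses import dataclass, field


@dataclass
class ToyVectorStore:
    """Illustrative stand-in, not the Moorcheh SDK."""

    _docs: dict = field(default_factory=dict)

    def add(self, nodes):
        # Store (id, text) pairs and return the inserted ids.
        for node_id, text in nodes:
            self._docs[node_id] = text
        return [node_id for node_id, _ in nodes]

    def query(self, text, top_k=2):
        # Trivial keyword-overlap "score" standing in for semantic similarity.
        scored = sorted(
            self._docs.items(),
            key=lambda kv: len(set(text.split()) & set(kv[1].split())),
            reverse=True,
        )
        return scored[:top_k]

    def delete(self, node_id):
        # Remove a document by id; ignore unknown ids.
        self._docs.pop(node_id, None)

    def generate_answer(self, question):
        # The real store calls Moorcheh's generative endpoint; here we
        # simply echo the best-matching document.
        hits = self.query(question, top_k=1)
        return hits[0][1] if hits else ""
```

The real integration exposes the same four operations, but backed by Moorcheh's hybrid scoring rather than keyword overlap.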

To see the integration in action, refer to the demonstration notebook: Google Colab Demo.

Getting started

To begin using the Moorcheh vector store, install the necessary packages:

pip install llama-index
pip install llama-index-vector-stores-moorcheh
pip install moorcheh-sdk

Example Usage

Here is a simple example demonstrating how to use the Moorcheh integration with LlamaIndex:

import os

from IPython.display import Markdown, display
from llama_index.core import SimpleDirectoryReader, StorageContext, VectorStoreIndex
from llama_index.vector_stores.moorcheh import MoorchehVectorStore

api_key = os.environ["MOORCHEH_API_KEY"]

# Load the documents you want to index
documents = SimpleDirectoryReader("./your-directory").load_data()

# Create a Moorcheh vector store.
# For text-based namespaces, set namespace_type to "text" and vector_dimension to None.
# For vector-based namespaces, set namespace_type to "vector" and vector_dimension to
# the dimension of your uploaded vectors.
vector_store = MoorchehVectorStore(
    api_key=api_key,
    namespace="llamaindex_moorcheh",
    namespace_type="text",
    vector_dimension=None,
    add_sparse_vector=False,
    batch_size=100,
)

storage_context = StorageContext.from_defaults(vector_store=vector_store)
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)

query_engine = index.as_query_engine()
response = query_engine.query("Which company has had the highest revenue in 2025 and why?")

display(Markdown(f"<b>{response}</b>"))
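The namespace rules above (text namespaces take no vector dimension; vector namespaces require one) can be encoded as a small helper. Note that check_namespace_config is a hypothetical illustration, not part of the SDK:

```python
def check_namespace_config(namespace_type, vector_dimension):
    """Validate the namespace_type / vector_dimension pairing described above."""
    if namespace_type == "text":
        # Text namespaces embed on the server side; no dimension is supplied.
        if vector_dimension is not None:
            raise ValueError('vector_dimension must be None for "text" namespaces')
    elif namespace_type == "vector":
        # Vector namespaces store pre-computed embeddings of a fixed dimension.
        if not isinstance(vector_dimension, int) or vector_dimension <= 0:
            raise ValueError('vector_dimension must be a positive int for "vector" namespaces')
    else:
        raise ValueError('namespace_type must be "text" or "vector"')
    return True
```

For example, check_namespace_config("text", None) and check_namespace_config("vector", 768) pass, while mixing the two raises a ValueError.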

Download files

Download the file for your platform.

Source Distribution

llama_index_vector_stores_moorcheh-0.1.0.tar.gz (7.8 kB)

Built Distribution

llama_index_vector_stores_moorcheh-0.1.0-py3-none-any.whl

File details

Details for the file llama_index_vector_stores_moorcheh-0.1.0.tar.gz.

File metadata

File hashes

Hashes for llama_index_vector_stores_moorcheh-0.1.0.tar.gz
Algorithm Hash digest
SHA256 e07584e39a8e9d0e4761d8792e66e9699e8a03183625c15b46692c152e52cdec
MD5 74eb8f6145e4a72a7f4a53d52fc1d418
BLAKE2b-256 6baedd844496f6679ba65eb2844a2a0f748df8825d085d50a91e3da788f80f83

File details

Details for the file llama_index_vector_stores_moorcheh-0.1.0-py3-none-any.whl.

File metadata

File hashes

Hashes for llama_index_vector_stores_moorcheh-0.1.0-py3-none-any.whl
Algorithm Hash digest
SHA256 07e184b1d52bcc44837fd520f1511da8aa437b49ed702f97d011f4789e92d69c
MD5 78bbcb41517bf01642f16d5cd31c1185
BLAKE2b-256 73ed9e7ff97135256b8e926870aa1cfd820e975b382dfb6532271f2a55ef2d9b
