
LlamaIndex Vector_Stores Integration: Moorcheh

Welcome to the Moorcheh vector store integration for LlamaIndex.

This module adds support for Moorcheh, a semantic vector database developed by EdgeAI Innovations. Moorcheh enables fast, intelligent document retrieval using hybrid scoring and generative answering. The integration implements the standard vector store interface defined by LlamaIndex and supports the core methods add, query, delete, and generate_answer.
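The call pattern for the four core methods can be sketched as follows. This is a hedged outline only: the method names come from this README and from LlamaIndex's standard vector store interface, but the exact Moorcheh signatures (in particular `generate_answer`) are assumptions here, not confirmed API.

```python
# Sketch of the core vector-store methods named above, assuming the
# standard LlamaIndex call pattern. Signatures are illustrative only.
def sketch_core_methods(vector_store, nodes, query, ref_doc_id):
    # add: insert embedded nodes, returning their ids (LlamaIndex convention)
    ids = vector_store.add(nodes)
    # query: run a similarity / hybrid-scored search
    result = vector_store.query(query)
    # delete: remove all nodes belonging to one source document
    vector_store.delete(ref_doc_id)
    # generate_answer: Moorcheh's generative answering (signature assumed)
    answer = vector_store.generate_answer(query)
    return ids, result, answer
```

In practice you rarely call these directly; the end-to-end example below drives them through a `VectorStoreIndex` and query engine instead.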

To see the integration in action, refer to the demonstration notebook: Google Colab Demo.

Getting started

To begin using the Moorcheh vector store, make sure to install the necessary packages:

pip install llama_index
pip install moorcheh_sdk
pip install llama-index-vector-stores-moorcheh

Example Usage

Here is a simple example demonstrating how to use the Moorcheh integration with LlamaIndex:

import os

from llama_index.core import SimpleDirectoryReader, StorageContext, VectorStoreIndex
from llama_index.vector_stores.moorcheh import MoorchehVectorStore

api_key = os.environ["MOORCHEH_API_KEY"]

# Load documents from a local directory
documents = SimpleDirectoryReader("./your-directory").load_data()

# Create a Moorcheh vector store with the following parameters.
# For text-based namespaces, set namespace_type to "text" and vector_dimension to None.
# For vector-based namespaces, set namespace_type to "vector" and vector_dimension
# to the dimension of your uploaded vectors.
vector_store = MoorchehVectorStore(
    api_key=api_key,
    namespace="llamaindex_moorcheh",
    namespace_type="text",
    vector_dimension=None,
    add_sparse_vector=False,
    batch_size=100,
)

storage_context = StorageContext.from_defaults(vector_store=vector_store)
index = VectorStoreIndex.from_documents(
    documents, storage_context=storage_context
)

query_engine = index.as_query_engine()
response = query_engine.query("Which company has had the highest revenue in 2025 and why?")

print(response)
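The two namespace configurations described in the comments above can be summarized side by side. The parameter names come from the constructor call in the example; the dimension value 1536 is a placeholder assumption and must match the dimension of your own uploaded vectors.

```python
# Text-based namespace: Moorcheh handles the text directly,
# so no vector dimension is supplied (per the comments above).
TEXT_NAMESPACE = {
    "namespace_type": "text",
    "vector_dimension": None,
}

# Vector-based namespace: the dimension must equal that of your
# uploaded vectors. 1536 is an example value only.
VECTOR_NAMESPACE = {
    "namespace_type": "vector",
    "vector_dimension": 1536,
}
```

Only these two parameters change between the modes; the remaining constructor arguments (`api_key`, `namespace`, `add_sparse_vector`, `batch_size`) are the same in both cases.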

Download files


Source Distribution

llama_index_vector_stores_moorcheh-0.2.0.tar.gz (7.8 kB)


Built Distribution

llama_index_vector_stores_moorcheh-0.2.0-py3-none-any.whl

File details

Details for the file llama_index_vector_stores_moorcheh-0.2.0.tar.gz.

File metadata

File hashes

Hashes for llama_index_vector_stores_moorcheh-0.2.0.tar.gz
SHA256: af8f82f25c0b7ff02d314f461af4d3411c482ddf2260a7a08313ed3425dfe78c
MD5: 004fecf47065ff13b4b328c83185f88f
BLAKE2b-256: 40eaee772b56521e6834eae2510552dd48b5f91f4811821b0f52af46adb78f5b


File details

Details for the file llama_index_vector_stores_moorcheh-0.2.0-py3-none-any.whl.

File metadata

File hashes

Hashes for llama_index_vector_stores_moorcheh-0.2.0-py3-none-any.whl
SHA256: ad6c3a073b89d52c32d06d56adfe3dbff50d160b413966596d1d4e62d202fadf
MD5: 83d7fc418538fbdd1b2c11a187e1beee
BLAKE2b-256: a5e41e884be9c75b6638eceb1db7c4c023398026888c3a636dade36f6a6aa470

