DataStax RAGStack Graph Store

RAGStack Graph Store

Hybrid Graph Store combining vector similarity and edges between chunks.

Usage

  1. Pre-process your documents to populate the metadata fields the store uses.
  2. Create a Hybrid GraphStore and add your LangChain Documents.
  3. Retrieve documents from the GraphStore.

Populate Metadata

The Graph Store makes use of the following metadata fields on each Document:

  • content_id: If assigned, this specifies the unique ID of the Document. If not assigned, one will be generated. Set this if you may re-ingest the same document, so that it is overwritten rather than duplicated (see the sketch after this list).
  • link_tags: A set of LinkTags indicating how this node should be linked to other nodes.
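
A minimal pre-processing sketch, assuming each document's metadata already contains a "source" entry (most LangChain loaders set one):

for doc in documents:
    # Use the source as a stable content_id so re-ingesting the same
    # document overwrites the existing entry instead of duplicating it.
    doc.metadata["content_id"] = doc.metadata["source"]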

Hyperlinks

To connect nodes based on hyperlinks, you can use the HtmlLinkEdgeExtractor as shown below:

from ragstack_knowledge_store.langchain.extractors import HtmlLinkEdgeExtractor

html_link_extractor = HtmlLinkEdgeExtractor()

for doc in documents:
    doc.metadata["content_id"] = doc.metadata["source"]

    # Add link tags from the page_content to the metadata.
    # extract_one() accepts the HTML content as a string or a BeautifulSoup object.
    html_link_extractor.extract_one(doc, doc.page_content)
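
The loop above expects documents whose page_content is raw HTML and whose metadata includes a "source" URL. One way to build such a list is a sketch using LangChain's AsyncHtmlLoader (not part of this package; the URL is a placeholder):

from langchain_community.document_loaders import AsyncHtmlLoader

# Placeholder URLs - replace with the pages you want to ingest.
urls = ["https://example.com/docs/index.html"]

# AsyncHtmlLoader keeps the raw page HTML in page_content and records
# the URL in metadata["source"], which the extraction loop relies on.
documents = AsyncHtmlLoader(urls).load()

Run the loader before the extraction loop above so that page_content holds the HTML the extractor needs.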

Store

import cassio
from langchain_openai import OpenAIEmbeddings
from ragstack_knowledge_store import GraphStore

cassio.init(auto=True)

graph_store = GraphStore(embeddings=OpenAIEmbeddings())

# Store the documents
graph_store.add_documents(documents)
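
cassio.init(auto=True) picks up the database connection from environment variables rather than explicit arguments, and OpenAIEmbeddings reads the OpenAI key from the environment as well. A sketch of the setup with placeholder values (the Astra DB variable names are an assumption here; check the cassio documentation for the names your version expects):

import os

os.environ["OPENAI_API_KEY"] = "sk-..."                   # placeholder
os.environ["ASTRA_DB_APPLICATION_TOKEN"] = "AstraCS:..."  # placeholder, assumed variable name
os.environ["ASTRA_DB_ID"] = "your-database-id"            # placeholder, assumed variable name

Set these before calling cassio.init(auto=True) and creating the GraphStore above.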

Retrieve

from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnablePassthrough
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o")

# Retrieve and generate answers using the relevant documents.
# depth=0 - don't traverse edges; equivalent to vector-only retrieval.
# depth=1 - vector search plus one level of edges.
retriever = graph_store.as_retriever(k=4, depth=1)

template = """You are a helpful technical support bot. You should provide complete answers explaining the options the user has available to address their problem. Answer the question based only on the following context:
{context}

Question: {question}
"""
prompt = ChatPromptTemplate.from_template(template)

def format_docs(docs):
    formatted = "\n\n".join(f"From {doc.metadata['content_id']}: {doc.page_content}" for doc in docs)
    return formatted


rag_chain = (
    {"context": retriever | format_docs, "question": RunnablePassthrough()}
    | prompt
    | llm
    | StrOutputParser()
)
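
The assembled chain takes a question string, retrieves up to k=4 documents plus one level of linked documents, and generates an answer from that context. For example (the question is just an illustration):

# Ask a question against the graph-augmented retriever.
answer = rag_chain.invoke("How do I link documents by their hyperlinks?")
print(answer)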

Development

poetry install --with=dev

# Run Tests
poetry run pytest

Download files

Source Distribution

ragstack_ai_knowledge_store-0.0.5.tar.gz (12.3 kB)

Built Distribution

ragstack_ai_knowledge_store-0.0.5-py3-none-any.whl (13.3 kB)
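
The package can also be installed from PyPI; the distribution name here is inferred from the filenames above:

pip install ragstack-ai-knowledge-store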
