

langchain-ragie

Ragie (https://ragie.ai) integration for LangChain

Install

pip install langchain-ragie

Usage

If you need asyncio, see the asynchronous sketch after the synchronous example below.

from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables import RunnablePassthrough

import langchain_ragie

template = """Answer the question based only on the following context:

{context}

Question: {question}
"""
prompt = ChatPromptTemplate.from_template(template)
model = ChatOpenAI()  # reads OPENAI_API_KEY from the environment
retriever = langchain_ragie.RagieRetriever()  # retrieves relevant chunks indexed in Ragie


def format_docs(docs):
    # Join the retrieved documents into one context string for the prompt
    return "\n\n".join([d.page_content for d in docs])


# Retrieve context, fill the prompt, call the model, and parse the output to a string
chain = (
    {"context": retriever | format_docs, "question": RunnablePassthrough()}
    | prompt
    | model
    | StrOutputParser()
)

response = chain.invoke("What do the besties think about Davos?")

print(response)
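
For asyncio, a minimal sketch: every LangChain runnable chain exposes ainvoke, so the chain defined above can be awaited directly (this assumes RagieRetriever is a standard LangChain retriever; components without native async support fall back to running in a worker thread):

import asyncio


async def main():
    # ainvoke executes the composed chain asynchronously
    response = await chain.ainvoke("What do the besties think about Davos?")
    print(response)


asyncio.run(main())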

License

Distributed under the MIT License. See LICENSE.txt for more information.
