
ChatBot with Retrieval Augmented Generation


ChatBot-RAG

A powerful chatbot implementation using Retrieval Augmented Generation (RAG) to provide context-aware responses based on your data.

Features

  • 🔍 Retrieval Augmented Generation: Enhances LLM responses with relevant context from your data
  • 🧠 Ollama Support: Run models locally with Ollama for privacy and customization
  • 🔗 LangChain Integration: Built on the powerful LangChain framework for advanced chains and pipelines

Installation

pip install chatbot-rag

Requirements

  • Python 3.12
  • Ollama (for local model hosting)
  • Tesseract-OCR (for image-based data extraction)

Tesseract Installation

To enable image-based data extraction, install Tesseract-OCR by following the official Tesseract installation instructions.

After installation, ensure the tesseract executable is accessible in your system's PATH. Verify by running:

tesseract --version
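If the command above fails, a short standard-library check shows whether (and where) the binary resolves. The commented lines mention the third-party pytesseract wrapper and the default Windows install path as one way to point at the executable explicitly; neither is part of this package, so treat them as an illustrative assumption:

```python
import shutil

# Check whether the tesseract binary resolves on PATH.
tesseract_path = shutil.which("tesseract")
print("tesseract found at:", tesseract_path)

# If it prints None, an OCR wrapper can be pointed at the executable
# directly, e.g. with the third-party pytesseract package
# (default Windows install path shown; adjust for your system):
# import pytesseract
# pytesseract.pytesseract.tesseract_cmd = r"C:\Program Files\Tesseract-OCR\tesseract.exe"
```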

Ollama Installation

Ollama is required for hosting models locally.
Refer to the official Ollama installation guide for instructions.

Once installed, verify it is working by running:

ollama --version
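Before running the usage examples, a small pre-flight check (standard library only, no assumptions about which models you have pulled) confirms the CLI is actually installed:

```python
import shutil
import subprocess

# Confirm the ollama CLI is installed before running the examples below.
ollama_bin = shutil.which("ollama")
if ollama_bin:
    result = subprocess.run([ollama_bin, "--version"], capture_output=True, text=True)
    print(result.stdout.strip())
else:
    print("ollama not found on PATH; install it first")
```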

Usage

Quick Start

from chatbot_rag.chat import Chatbot
from chatbot_rag.RAG import RAG

# Index the data, then load a specific Ollama model
rag = RAG(path="./data/")
rag()
bot = Chatbot(name="deepseek-r1:8b")

# Query, retrieving the top k=5 context chunks
question = "Summarize my recent research on climate change"
context = rag._search_context(question, k=5)
response = bot(context, question)
print(response)

Using temporary paths

import os
import tempfile

from chatbot_rag.chat import Chatbot
from chatbot_rag.RAG import RAG


with tempfile.TemporaryDirectory() as tmpdirname:
    persistent_dir = os.path.join(tmpdirname, "all_info/")
    os.makedirs(persistent_dir, exist_ok=True)
    rag = RAG(path="./data/", base_persist_path=persistent_dir)
    rag()
    chatbot = Chatbot(name="llama3.1:8b")

    question = "What is the main topic of the document?"
    context = rag._search_context(question)
    answer = chatbot(context=context, question=question)
    print(f"Answer: {answer}")

Using other preprocessing (PyMuPDFPreprocessing)

from chatbot_rag.chat import Chatbot
from chatbot_rag.RAG import RAG
from chatbot_rag.preprocessing import PyMuPDFPreprocessing


kwargs = {"tesseract_path": "C:/Program Files/Tesseract-OCR/tesseract"}
rag = RAG(path="./data/", preprocessing=PyMuPDFPreprocessing, **kwargs)
rag()
chatbot = Chatbot(name="llama3.1:8b")

question = "What is the main topic of the document?"
context = rag._search_context(question)
answer = chatbot(context=context, question=question)
print(f"Answer: {answer}")

By default, the system will attempt to extract information from images using Tesseract-OCR. You can disable image extraction by adding the following to the kwargs:

kwargs = {"extract_images": False}

and passing it directly to the RAG component.
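Putting that together, a minimal sketch looks like the following. It assumes, per the text above, that RAG forwards extra keyword arguments to its preprocessing step; the constructor call is commented out because it needs local data and a running Ollama instance:

```python
# Disable OCR-based image extraction via kwargs.
kwargs = {"extract_images": False}

# Pass the kwargs directly to the RAG component:
# from chatbot_rag.RAG import RAG
# rag = RAG(path="./data/", **kwargs)
print(kwargs)
```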

Projects Using ChatBot-RAG

🌟 ChatBot-RAG App

ChatBot-RAG App is a chatbot framework leveraging Retrieval-Augmented Generation (RAG) to deliver context-aware responses.
It integrates 🔗 LangChain for advanced pipelines and supports 🧠 Ollama for local model hosting, ensuring enhanced privacy and customization.

Key Features:

  • 🔍 Context-Aware Responses: Uses RAG to provide accurate and relevant answers.
  • 🧠 Local Model Hosting: Powered by Ollama for privacy and flexibility.
  • 🔗 Advanced Pipelines: Built on LangChain for seamless integration and extensibility.

Explore the project here: ChatBot-RAG App

License

This project is licensed under the Apache 2.0 License - see the LICENSE file for details.
