A RAG (Retrieval-Augmented Generation) system using Llama Index and ChromaDB
Project description
Llama Index Query Engine + Ollama Model to Create Your Own Knowledge Pool
This project is a modular application that builds an efficient query engine on top of LlamaIndex, ChromaDB, and custom embeddings. It lets you index documents from multiple directories and query them in natural language. You can point it at any local folders, including synced OneDrive and iCloud folders.
Table of Contents
- Usage
- Features
- Project Structure
- Prerequisites
- Installation
- Contributing
- License
Usage
Running a Query
from ollama_rag import OllamaRAG
# Initialize the query engine with your configurations
engine = OllamaRAG(
    model_name="llama3.2",  # replace with your Ollama model name
    request_timeout=120.0,
    embedding_model_name="BAAI/bge-large-en-v1.5",  # replace with your Hugging Face embedding model
    trust_remote_code=True,
    input_dirs=[
        "/your/path/to/your/documents",
        # Add more directories as needed.
        # In a WSL environment, use paths like "/mnt/c/...".
        # On Windows, use raw strings such as r"C:\Users\<YourUsername>\Documents".
        # On macOS, use paths like "/Users/<YourUsername>/Documents".
        # To index Obsidian notes, point to the "iCloud~md~obsidian" folder in iCloud, or to your local vault.
    ],
    required_exts=[
        ".txt", ".md", ".html", ".htm", ".xml", ".json", ".csv",
        ".pdf", ".doc", ".docx", ".rtf", ".ipynb",
        ".ppt", ".pptx", ".xls", ".xlsx",
        # Omit required_exts to index all supported extensions.
    ],
)
# Update the index with new or updated documents
engine.update_index()
# Run a query
response = engine.query("can LLM generate creative contents?")
print(response)
Output is a dict:
{
    'response': "Yes, the text suggests that LLMs (Large Language Models) can generate novel research ideas and even outperform human experts in terms of novelty. The authors claim that their AI agent generates ideas that are statistically more novel than those written by expert researchers. However, it's worth noting that the effectiveness of LLMs in generating creative content is a topic of ongoing debate, and not all studies have found similar results (e.g., Chakrabarty et al. (2024) found that AI writings are less creative than professional writers). Nevertheless, based on the provided context, it appears that LLMs can generate novel research ideas under certain conditions.",
    'sources': [
        {
            'document_id': 'Can LLMs Generate Novel Research Ideas.pdf',
            'file_path': '/mnt/d/Paper/Can LLMs Generate Novel Research Ideas.pdf',
            'page_number': '18',
            'sheet_name': 'N/A',
            'text_snippet': '9 Related Work\nResearch idea generation and execution . Several prior works explored methods to improve idea\ngeneration, such as iterative novelty boosting (Wang et al., 2024), multi-agent collaborati...'
        }
    ]
}
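If you want to surface the retrieved sources separately from the answer, you can iterate over the 'sources' list in the returned dict. A minimal sketch based on the structure shown above:

result = engine.query("can LLM generate creative contents?")

# The answer text.
print(result["response"])

# Each source entry carries the originating document and its location metadata.
for src in result["sources"]:
    print(f"- {src['document_id']} (page {src['page_number']}): {src['file_path']}")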
Features
- Modular Design: The project is organized into separate modules for easy maintenance and scalability.
- Efficient Indexing: Uses ChromaDB to store embeddings, enabling efficient indexing and querying (see the sketch after this list).
- Incremental Updates: Only new or updated documents are indexed, improving performance.
- Multiple Directories Support: Indexes documents from multiple directories across different locations.
- Custom Embeddings: Utilizes custom embedding models for better performance.
- Error Handling: Gracefully handles missing directories or files and recreates the index as needed.
- Logging: Provides detailed logs for monitoring and debugging.
- Advanced Text-Based File Support: Supports a variety of text-based file formats, including .txt, .md, .html, .htm, .xml, .json, .csv, .pdf, .doc, .docx, .rtf, .ipynb, .ppt, .pptx, .xls, and .xlsx.
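For context on the ChromaDB and custom-embedding features above, here is a minimal sketch of how embeddings can be persisted to a ChromaDB collection through LlamaIndex. This is illustrative only, not the package's internal implementation; the collection name and paths are placeholders, and the exact import locations depend on your llama-index version and installed integration packages.

import chromadb
from llama_index.core import SimpleDirectoryReader, StorageContext, VectorStoreIndex
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
from llama_index.llms.ollama import Ollama
from llama_index.vector_stores.chroma import ChromaVectorStore

# Persist embeddings in a local ChromaDB collection.
chroma_client = chromadb.PersistentClient(path="./chroma_db")
collection = chroma_client.get_or_create_collection("documents")
vector_store = ChromaVectorStore(chroma_collection=collection)
storage_context = StorageContext.from_defaults(vector_store=vector_store)

# Custom Hugging Face embedding model, as in the configuration above.
embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-large-en-v1.5")

# Load documents and build the vector index.
documents = SimpleDirectoryReader(
    "/your/path/to/your/documents", required_exts=[".pdf", ".md"]
).load_data()
index = VectorStoreIndex.from_documents(
    documents, storage_context=storage_context, embed_model=embed_model
)

# Query through a local Ollama model.
llm = Ollama(model="llama3.2", request_timeout=120.0)
print(index.as_query_engine(llm=llm).query("can LLM generate creative contents?"))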
Project Structure
ollama_rag/
├── ollama_rag/
│ ├── __init__.py
│ ├── ollama_rag.py # Main class OllamaRAG
│ ├── models.py
│ ├── data_loader.py
│ ├── indexer.py
│ ├── query_engine.py
│ ├── prompts.py
│ ├── document_tracker.py
│
├── tests/
│ └── ... (test scripts)
├── setup.py
├── README.md
├── LICENSE
├── MANIFEST.in
└── requirements.txt
Prerequisites
- Python 3.7 or higher: Ensure you have Python installed.
- Git: For cloning the repository.
- Pip: Python package installer.
- Ollama: Download from https://ollama.com/download, then pull the model you want to use, for example:
ollama pull llama3.2
- LibreOffice:
Required for converting .ppt files to .pptx when processing PowerPoint files. After conversion, delete the original .ppt file; otherwise it will be converted and re-indexed on every run (see the sketch after this list).
Ubuntu/Debian:
sudo apt update
sudo apt install libreoffice
macOS (using Homebrew):
brew install --cask libreoffice
Windows: Download and install from the LibreOffice official website https://www.libreoffice.org/download/download-libreoffice/.
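The .ppt-to-.pptx step relies on LibreOffice's headless converter. Below is a minimal sketch of invoking it from Python; the paths are placeholders, this is not the package's own conversion code, and on macOS the binary may be soffice rather than libreoffice.

import subprocess
from pathlib import Path

def convert_ppt_to_pptx(ppt_path: str, out_dir: str) -> Path:
    """Convert a legacy .ppt file to .pptx with LibreOffice in headless mode."""
    subprocess.run(
        ["libreoffice", "--headless", "--convert-to", "pptx", "--outdir", out_dir, ppt_path],
        check=True,
    )
    return Path(out_dir) / (Path(ppt_path).stem + ".pptx")

# Convert, then remove the original so it is not converted and re-indexed again.
ppt = "/your/path/to/slides.ppt"
pptx = convert_ppt_to_pptx(ppt, "/your/path/to")
Path(ppt).unlink()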
Installation
Install via PyPI (Recommended)
You can install ollama_rag directly from PyPI:
pip install --upgrade ollama-rag
Install from Source
- Clone the Repository
git clone https://github.com/Zakk-Yang/ollama-rag.git
cd ollama-rag
- Create a Virtual Environment (Recommended)
conda create -n env python=3.10
conda activate env
- Install Dependencies and the Package
pip install .
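Whichever install method you use, a quick sanity check is to import the main class (using the same import shown in the Usage section):

# Should complete without errors if the package is installed correctly.
from ollama_rag import OllamaRAG

print(OllamaRAG.__name__)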
Contributing
Contributions are welcome! Please follow these steps:
1. Fork the Repository
2. Create a Branch
git checkout -b feature/your-feature-name
3. Commit Your Changes
git commit -am 'Add new feature'
4. Push to the Branch
git push origin feature/your-feature-name
License
The source code is licensed under the MIT License; see the LICENSE file for details.