
An efficient Python library for building AI applications using the Retrieval-Augmented Generation (RAG) pipeline.

CARAG: A Powerful Python Library to Develop AI Applications with RAG Pipeline



✨ Description

CARAG is a Python library that leverages a hybrid Retrieval-Augmented Generation (RAG) approach along with a semantic cache (memory) to efficiently store and retrieve embeddings. By combining dense, sparse, and late-interaction embeddings, it offers a robust solution for managing large datasets of unstructured text files and obtaining relevant, grounded responses generated by pre-trained LLMs via the Mistral API.

✨ Features

🚀 Hybrid RAG: Utilizes dense, sparse, and late interaction embeddings for enhanced performance.
🔌 Easy Integration: Simple API for storing and searching embeddings.
📄 PDF/CSV Support: Directly store embeddings from PDF/CSV documents.
🎉 Grounded Generation from LLM: Get synthesized responses from "mistral-large-latest".

Read more on the Medium blog.
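The hybrid approach combines rankings from several embedding models. As an illustration only (this is not CARAG's internal code, and CARAG's actual fusion strategy may differ), here is how dense, sparse, and late-interaction rankings could be merged with reciprocal rank fusion (RRF):

```python
# Illustrative sketch: fusing dense, sparse, and late-interaction rankings
# with reciprocal rank fusion (RRF). NOT CARAG's internal implementation.

def rrf_fuse(rankings, k=60):
    """Combine several ranked lists of document ids into one fused ranking."""
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking):
            # documents ranked highly in any list accumulate more score
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank + 1)
    return sorted(scores, key=scores.get, reverse=True)

dense_rank = ["doc_a", "doc_b", "doc_c"]
sparse_rank = ["doc_b", "doc_a", "doc_d"]
late_rank = ["doc_b", "doc_c", "doc_a"]

fused = rrf_fuse([dense_rank, sparse_rank, late_rank])
print(fused[0])  # doc_b ranks first: it tops two of the three lists
```

A fusion step like this lets each embedding type compensate for the others' blind spots (exact keyword matches from sparse, semantics from dense).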

🌱 Getting Started

Prerequisites

  • PyMuPDF
  • Mistral
  • fastembed
  • qdrant_client
  • ipywidgets

🚀 Installation

To install the latest version of CARAG, simply run:

pip install carag==1.0.8

Set up a virtual environment

python3 -m venv venv
source venv/bin/activate  # On macOS/Linux
venv\Scripts\activate  # On Windows

Install dependencies

pip install -r requirements.txt

Create a .env file

Create a file named .env in the root directory of your project. This file will store your API keys and other sensitive information.

import os
from dotenv import load_dotenv

load_dotenv()  # reads variables from the .env file

url = os.getenv("YOUR_QDRANT_URL")
api_key = os.getenv("YOUR_QDRANT_API_KEY")
mistral_api_key = os.getenv("YOUR_MISTRAL_API_KEY")
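The .env file itself holds the corresponding key-value pairs. The values below are placeholders (substitute your own credentials and endpoint):

```
# .env — example placeholders, replace with your own values
YOUR_QDRANT_URL=https://your-cluster.qdrant.io
YOUR_QDRANT_API_KEY=your-qdrant-api-key
YOUR_MISTRAL_API_KEY=your-mistral-api-key
```

Remember to add .env to your .gitignore so credentials are never committed.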

📦 Usage

from carag import *

Initiate pipeline objects

If the collection does not exist in the Qdrant DB, it is created with the given name (e.g., collection_name='test').

rag = rag_pipe(url="YOUR_QDRANT_URL", 
      api_key="YOUR_QDRANT_API_KEY",
      collection_name="YOUR_COLLECTION_NAME")

gg = GroundGeneration(
      url="YOUR_QDRANT_URL", 
      api_key="YOUR_QDRANT_API_KEY",
      mistral_api_key="YOUR_MISTRAL_API_KEY",
      collection_name="YOUR_COLLECTION_NAME",
      llm_model_name="MISTRAL_LLM_NAME",
      temperature=0.7, max_tokens=2000
)

# a collection with the chosen name will be created if it does not exist

Upload text chunks to the collection

e.g., collection_name = "test"

Example data:

text_chunks = [
    {
        "text": "The EU AI Act prohibits certain uses of artificial intelligence (AI). These include AI systems that manipulate people's decisions or exploit their vulnerabilities, systems that evaluate or classify people based on their social behavior or personal traits, and systems that predict a person's risk of committing a crime.",
        "metadata": {"source": "prohibited AI practice", "page": 1}
    },
    {
        "text": "Article 4 of the AI Act requires providers and deployers of AI systems to ensure a sufficient level of AI literacy to their staff and anyone using the systems on their behalf. The article entered into application on 2 February 2025. Several organizations have anticipated and prepared themselves",
        "metadata": {"source": "Article 4", "page": 2}
    },
    {
        "text": "Banned AI applications in the EU include: Cognitive behavioral manipulation of people or specific vulnerable groups: for example voice-activated toys that encourage dangerous behavior in children",
        "metadata": {"source": "unacceptable risk", "page": 3}
    },
]

# indexes & stores embeddings from a list of text chunks - List[Dict]
rag_pipe.upload_text_chunks(
  url="YOUR_QDRANT_URL",
  api_key="YOUR_QDRANT_API_KEY",
  text_chunks=text_chunks,
  collection_name="YOUR_COLLECTION_NAME",
  batch_size=1
)

Get the top search results for a query (assuming the collection already has embedding vectors stored in the vector DB):

# example:
top_k_scored_points = rag.invoke(url, api_key, "What are the key points of the European AI Act 2024?", 'test')

or

top_10_scored_points = gg.retrieve("What are the key points of the European AI Act 2024?", 'test', cache_first=True, top_k=10)
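The cache_first=True flag suggests the semantic cache is consulted before the vector DB. A rough sketch of that pattern (the function names, cache layout, and similarity threshold below are assumptions for illustration, not CARAG's actual internals):

```python
# Illustrative cache-first lookup: check a semantic cache of previous
# query embeddings (by cosine similarity) before hitting the vector DB.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def cache_first_retrieve(query_vec, cache, db_search, threshold=0.9):
    """cache: list of (embedding, cached_results); db_search: fallback callable."""
    for vec, results in cache:
        if cosine(query_vec, vec) >= threshold:
            return results          # cache hit: skip the vector DB entirely
    return db_search(query_vec)     # cache miss: query the vector DB

cache = [([1.0, 0.0], ["cached answer"])]
hit = cache_first_retrieve([0.99, 0.05], cache, lambda v: ["db answer"])
miss = cache_first_retrieve([0.0, 1.0], cache, lambda v: ["db answer"])
print(hit, miss)  # ['cached answer'] ['db answer']
```

Serving near-duplicate queries from the cache avoids repeated embedding searches and LLM calls, which is the main cost saving of a semantic cache.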

Get the top 3 responses / answers from the Mistral LLM

top_responses = gg.grounded_generation_from_llm(query="your_query")
# temperature=0: more deterministic; temperature=1: more random
answers = top_responses['top_results']
print(answers)

IMPORTANT NOTES

ONLY Mistral AI LLMs are supported as of now.

  • Qdrant offers a free tier with 4GB disk space on cloud. To generate your API key and (URL) endpoint, visit Qdrant.

  • Mistral AI offers a free tier with limits of 1 billion tokens per month, 500K tokens per minute, and 1 request per second.

🤝 Contributing

Feel free to contribute to the source code by reporting bugs, suggesting features, or submitting pull requests.

QR code for feedback form and appointments

Don't forget to star (🌟) this repo to find it easier later.
