
RAG Workflow Optimizer Based on Bayesian Optimization

RAGOpt eliminates manual hyperparameter tuning in your RAG pipelines with Bayesian optimization 🚀

RAGOpt is a Python framework to optimize Retrieval-Augmented Generation (RAG) pipelines. It eliminates manual hyperparameter tuning using Bayesian optimization, automatically finding the best configuration for your dataset and use case.

Key Features

  • Broad Search Space - Optimizes 20+ RAG hyperparameters, including chunk size, overlap, embedding strategy, and LLM selection
  • Provider-Agnostic - Works with any LangChain-compatible model or provider
  • Partially Opinionated - Smart defaults with full flexibility for customization
  • Pareto-Optimal Results - Generates the best trade-off configurations for your specific data
  • Comprehensive Metrics - Quality (precision, recall, faithfulness), performance (latency, cost), and safety (toxicity, bias)

Installation

pip install rag-opt

🔥 Quick Start

1. Generate Training Questions

from langchain.chat_models import init_chat_model
from rag_opt.rag import DatasetGenerator

# Initialize LLM
llm = init_chat_model(
    model="gpt-3.5-turbo",
    model_provider="openai",
    api_key="sk-***"
)

# Generate Q&A pairs from your documents
data_gen = DatasetGenerator(llm, dataset_path="./data")
dataset = data_gen.generate(10)
dataset.to_json("./rag_dataset.json")
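The exact on-disk schema of the generated dataset is defined by RAGOpt; as a rough illustration, a Q&A record might round-trip through JSON like this (the field names `question`, `answer`, and `context` are assumptions for this sketch, not confirmed RAGOpt output):

```python
import json

# Hypothetical shape of one generated Q&A record (field names assumed).
record = {
    "question": "What does RAGOpt optimize?",
    "answer": "RAG pipeline hyperparameters such as chunk size and k.",
    "context": "RAGOpt eliminates manual hyperparameter tuning...",
}

# Serialize a list of records, as dataset.to_json() would write to disk,
# then load it back the way a training step might.
serialized = json.dumps([record], indent=2)
loaded = json.loads(serialized)
```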

2. Define Your Search Space

Create rag_config.yaml:

chunk_size:
  bounds: [512, 1024]
  dtype: int

max_tokens:
  bounds: [256, 512]
  dtype: int

chunk_overlap:
  bounds: [0, 200]
  dtype: int

temperature:
  bounds: [0.0, 1.0]
  dtype: float

search_type:
  choices: ["similarity", "mmr", "hybrid"]

vector_store:
  choices:
    faiss: {}
    pinecone:
      api_key: "YOUR_API_KEY"
      index_name: "your-index"

embedding:
  choices:
    openai:
      api_key: "YOUR_API_KEY"
      models:
        - "text-embedding-3-large"
        - "text-embedding-ada-002"
    huggingface:
      models:
        - "all-MiniLM-L6-v2"

llm:
  choices:
    openai:
      api_key: "YOUR_API_KEY"
      models:
        - "gpt-4o"
        - "gpt-3.5-turbo"

k:
  bounds: [1, 10]
  dtype: int

use_reranker: false
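To make the search-space semantics concrete, here is a minimal sketch of drawing one random configuration from a space shaped like `rag_config.yaml`. The space is written as a Python dict (mirroring the YAML keys above) so the example needs no YAML parser; this illustrates the bounds/choices contract, not RAGOpt's internal sampler:

```python
import random

# Search space mirroring a subset of rag_config.yaml:
# continuous/integer parameters use "bounds", categoricals use "choices".
search_space = {
    "chunk_size":  {"bounds": [512, 1024], "dtype": "int"},
    "temperature": {"bounds": [0.0, 1.0], "dtype": "float"},
    "search_type": {"choices": ["similarity", "mmr", "hybrid"]},
    "k":           {"bounds": [1, 10], "dtype": "int"},
}

def sample(space):
    """Draw one configuration uniformly at random from the space."""
    config = {}
    for name, spec in space.items():
        if "choices" in spec:
            config[name] = random.choice(spec["choices"])
        elif spec["dtype"] == "int":
            lo, hi = spec["bounds"]
            config[name] = random.randint(lo, hi)
        else:
            lo, hi = spec["bounds"]
            config[name] = random.uniform(lo, hi)
    return config

config = sample(search_space)
```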

3. Run Optimization

from rag_opt.dataset import TrainDataset
from rag_opt.optimizer import Optimizer

# Load dataset
train_dataset = TrainDataset.from_json("rag_dataset.json")

# Initialize optimizer
optimizer = Optimizer(
    train_dataset=train_dataset,
    config_path="rag_config.yaml",
    verbose=True
)

# Find optimal configuration
best_config = optimizer.optimize(n_trials=50, best_one=True)
best_config.to_json()
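Conceptually, `optimize(n_trials=50)` runs a propose-evaluate-update loop: past trial results inform which configuration to try next. The toy below conveys that idea over a single hyperparameter, using a nearest-neighbour surrogate with an exploration bonus as a stand-in for RAGOpt's actual Bayesian model; the objective function and all constants are invented for illustration:

```python
import random

def objective(chunk_size):
    # Hypothetical evaluation score that peaks at chunk_size = 768.
    return -abs(chunk_size - 768)

def suggest(history, bounds, n_candidates=200):
    """Propose the candidate maximizing surrogate score + exploration bonus."""
    lo, hi = bounds
    if not history:
        return random.randint(lo, hi)
    best_cand, best_acq = None, float("-inf")
    for _ in range(n_candidates):
        x = random.randint(lo, hi)
        # Surrogate prediction: score of the nearest evaluated point.
        nearest = min(history, key=lambda h: abs(h[0] - x))
        acq = nearest[1] + 0.1 * abs(nearest[0] - x)  # exploit + explore
        if acq > best_acq:
            best_cand, best_acq = x, acq
    return best_cand

random.seed(0)
history = []
for trial in range(30):
    x = suggest(history, (512, 1024))
    history.append((x, objective(x)))

best_x, best_score = max(history, key=lambda h: h[1])
```

Each trial spends its budget where the surrogate predicts high scores while the distance bonus keeps unexplored regions in play, which is the same exploit/explore trade-off a real Bayesian optimizer makes with a probabilistic model.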

4. Get Your Optimized Config

Output example:

{
  "chunk_size": 512,
  "max_tokens": 256,
  "chunk_overlap": 200,
  "search_type": "hybrid",
  "k": 1,
  "temperature": 1.0,
  "embedding": {
    "provider": "openai",
    "model": "text-embedding-3-large"
  },
  "llm": {
    "provider": "openai",
    "model": "gpt-4o"
  },
  "vector_store": {
    "provider": "faiss"
  },
  "use_reranker": true
}

How It Works

  1. Dataset Generation - Create synthetic Q&A pairs from your documents
  2. Search Space Definition - Configure which parameters to optimize
  3. Bayesian Optimization - Intelligently sample and evaluate configurations
  4. Multi-Metric Evaluation - Assess quality, performance, and safety
  5. Pareto-Optimal Results - Get the best configurations for your priorities
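"Pareto-optimal" means no other configuration is at least as good on every metric and strictly better on one. A minimal sketch of that selection over two metrics (maximize quality, minimize cost) — the configuration names and numbers are hypothetical, not RAGOpt output:

```python
def pareto_front(configs):
    """Keep configs not dominated on (quality: higher better, cost: lower better)."""
    front = []
    for c in configs:
        dominated = any(
            o["quality"] >= c["quality"] and o["cost"] <= c["cost"]
            and (o["quality"] > c["quality"] or o["cost"] < c["cost"])
            for o in configs
        )
        if not dominated:
            front.append(c)
    return front

configs = [
    {"name": "A", "quality": 0.90, "cost": 0.020},
    {"name": "B", "quality": 0.85, "cost": 0.005},
    {"name": "C", "quality": 0.80, "cost": 0.010},  # dominated by B
    {"name": "D", "quality": 0.95, "cost": 0.040},
]
front = pareto_front(configs)  # A, B, D survive; C is dominated
```

The front is what "best configurations for your priorities" means: you pick among A, B, and D depending on how much quality is worth per unit of cost.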

RAGOpt vs Alternatives

  • AutoRAG - RAGOpt uses Bayesian optimization rather than AutoRAG's grid search
  • Ragas - RAGOpt's evaluation layer is a flexible framework, not a rigid suite: bring your own metrics
  • Manual Tuning - RAGOpt's systematic, data-driven search saves time and improves results

License

This project is licensed under the terms of the MIT license.
