
RAG Workflow Optimizer Based on Bayesian Optimization

Project description


RAGOpt eliminates manual hyperparameter tuning in your RAG pipelines with Bayesian optimization 🚀




RAGOpt is a Python framework for optimizing Retrieval-Augmented Generation (RAG) pipelines. Instead of manual hyperparameter tuning, it uses Bayesian optimization to automatically find the best configuration for your dataset and use case.

Key Features

  • Broad search space - Optimizes 20+ RAG hyperparameters, including chunk size, overlap, embedding strategy, and LLM selection
  • Provider-agnostic - Works with any LangChain-compatible model or provider
  • Partially opinionated - Smart defaults with full flexibility for customization
  • Pareto-optimal results - Generates Pareto-optimal configurations for your specific data
  • Comprehensive metrics - Quality (precision, recall, faithfulness), performance (latency, cost), and safety (toxicity, bias)

Installation

pip install rag-opt

🔥 Quick Start

1. Generate Training Questions

from langchain.chat_models import init_chat_model
from rag_opt.rag import DatasetGenerator

# Initialize LLM
llm = init_chat_model(
    model="gpt-3.5-turbo",
    model_provider="openai",
    api_key="sk-***"
)

# Generate Q&A pairs from your documents
data_gen = DatasetGenerator(llm, dataset_path="./data")
dataset = data_gen.generate(10)
dataset.to_json("./rag_dataset.json")

2. Define Your Search Space

Create rag_config.yaml:

chunk_size:
  bounds: [512, 1024]
  dtype: int

max_tokens:
  bounds: [256, 512]
  dtype: int

chunk_overlap:
  bounds: [0, 200]
  dtype: int

temperature:
  bounds: [0.0, 1.0]
  dtype: float

search_type:
  choices: ["similarity", "mmr", "hybrid"]

vector_store:
  choices:
    faiss: {}
    pinecone:
      api_key: "YOUR_API_KEY"
      index_name: "your-index"

embedding:
  choices:
    openai:
      api_key: "YOUR_API_KEY"
      models:
        - "text-embedding-3-large"
        - "text-embedding-ada-002"
    huggingface:
      models:
        - "all-MiniLM-L6-v2"

llm:
  choices:
    openai:
      api_key: "YOUR_API_KEY"
      models:
        - "gpt-4o"
        - "gpt-3.5-turbo"

k:
  bounds: [1, 10]
  dtype: int

use_reranker: false
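
Before launching a run, a search space like the one above can be sanity-checked with PyYAML. This is a minimal sketch; the `validate_search_space` helper is illustrative and not part of RAGOpt:

```python
import yaml

def validate_search_space(path: str) -> dict:
    """Load a rag_config.yaml-style file and check that each entry is a
    bounds range, a non-empty choices list/mapping, or a fixed value."""
    with open(path) as f:
        space = yaml.safe_load(f)
    for name, spec in space.items():
        if isinstance(spec, dict) and "bounds" in spec:
            lo, hi = spec["bounds"]
            assert lo <= hi, f"{name}: bounds must be [low, high]"
        elif isinstance(spec, dict) and "choices" in spec:
            assert spec["choices"], f"{name}: choices must be non-empty"
        # anything else (e.g. use_reranker: false) is treated as a fixed value
    return space
```
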

3. Run Optimization

from rag_opt.dataset import TrainDataset
from rag_opt.optimizer import Optimizer

# Load dataset
train_dataset = TrainDataset.from_json("rag_dataset.json")

# Initialize optimizer
optimizer = Optimizer(
    train_dataset=train_dataset,
    config_path="rag_config.yaml",
    verbose=True
)

# Find optimal configuration
best_config = optimizer.optimize(n_trials=3, best_one=True)
best_config.to_json()

4. Get Your Optimized Config

Output example:

{
  "chunk_size": 500,
  "max_tokens": 100,
  "chunk_overlap": 200,
  "search_type": "hybrid",
  "k": 1,
  "temperature": 1.0,
  "embedding": {
    "provider": "openai",
    "model": "text-embedding-3-large"
  },
  "llm": {
    "provider": "openai",
    "model": "gpt-4o"
  },
  "vector_store": {
    "provider": "faiss"
  },
  "use_reranker": true
}
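
The saved configuration is plain JSON, so it can be consumed directly when wiring up a pipeline by hand. A minimal sketch (in practice, load the file written by `best_config.to_json()` instead of the inline string):

```python
import json

# Example of consuming an optimized config of the shape shown above.
raw = """
{
  "chunk_size": 500,
  "search_type": "hybrid",
  "embedding": {"provider": "openai", "model": "text-embedding-3-large"}
}
"""
cfg = json.loads(raw)

# Pull out the pieces you need to build your retriever and generator:
chunk_size = cfg["chunk_size"]
embed = cfg["embedding"]  # {"provider": ..., "model": ...}
print(f"Use {embed['model']} with chunks of {chunk_size} tokens")
```
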

How It Works

  1. Dataset Generation - Create synthetic Q&A pairs from your documents
  2. Search Space Definition - Configure which parameters to optimize
  3. Bayesian Optimization - Intelligently sample and evaluate configurations
  4. Multi-Metric Evaluation - Assess quality, performance, and safety
  5. Pareto-Optimal Results - Get the best configurations for your priorities
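
The sample-evaluate-select loop in steps 3-5 can be sketched as follows. This is illustrative only: a real Bayesian optimizer fits a surrogate model to past trials and picks the next sample by an acquisition function such as expected improvement, whereas this stand-in samples randomly and keeps the best:

```python
import random

def optimize(search_space, evaluate, n_trials=10, seed=0):
    """Simplified optimization loop: sample a configuration from the
    search space, evaluate it, keep the best one seen so far."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(n_trials):
        # Draw one value per parameter: an int from bounds, or one choice.
        cfg = {
            name: (rng.randint(*spec["bounds"]) if "bounds" in spec
                   else rng.choice(spec["choices"]))
            for name, spec in search_space.items()
        }
        score = evaluate(cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score
```
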

RAGOpt vs Alternatives

  • vs AutoRAG - Bayesian optimization instead of AutoRAG's grid search, so good configurations are found in fewer trials
  • vs Ragas - A flexible evaluation framework rather than a fixed one; bring your own metrics
  • vs manual tuning - A systematic, data-driven search that saves time and improves results

License

This project is licensed under the terms of the MIT license.

Download files

Download the file for your platform.

Source Distribution

rag_opt-0.11.1.tar.gz (60.8 kB)

Uploaded Source

Built Distribution


rag_opt-0.11.1-py3-none-any.whl (71.4 kB)

Uploaded Python 3

File details

Details for the file rag_opt-0.11.1.tar.gz.

File metadata

  • Download URL: rag_opt-0.11.1.tar.gz
  • Upload date:
  • Size: 60.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: pdm/2.26.3.dev5+g0a2e7fd4 CPython/3.12.12 Linux/6.11.0-1018-azure

File hashes

Hashes for rag_opt-0.11.1.tar.gz
Algorithm Hash digest
SHA256 be98dfe0364c70e64403932b3338d4cefdb2fc6080cd0314048aa7f07ce3683f
MD5 b2f9ac4efbebfb90d54d9b849f06d211
BLAKE2b-256 db104c18035beaa1dc23b9824915b12b3ab16455b3bb416aae6c206acb27516b

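The published SHA256 digest can be checked against a downloaded file with the standard library alone. A minimal sketch (the file path is whatever your download saved):

```python
import hashlib

def sha256_of(path: str, chunk: int = 65536) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

# Compare the result against the digest published on PyPI, e.g.:
# sha256_of("rag_opt-0.11.1.tar.gz") == "be98dfe0...683f"
```
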

File details

Details for the file rag_opt-0.11.1-py3-none-any.whl.

File metadata

  • Download URL: rag_opt-0.11.1-py3-none-any.whl
  • Upload date:
  • Size: 71.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: pdm/2.26.3.dev5+g0a2e7fd4 CPython/3.12.12 Linux/6.11.0-1018-azure

File hashes

Hashes for rag_opt-0.11.1-py3-none-any.whl
Algorithm Hash digest
SHA256 055457adafd7e2c63d864921e236d3eefc1bde420bc708118fa144ad79cb0a4c
MD5 45adac3189b2af7305d425c89c841f3c
BLAKE2b-256 9c88ae41b1fd3b8e1d7b04a8071c99bf3d3bf8870c3a17ee894c1f8da259a1e7

