
RAG Workflow Optimizer Based on Bayesian Optimization

Project description


RAGOpt eliminates manual hyperparameter tuning in your RAG pipelines with Bayesian optimization 🚀




RAGOpt is a Python framework to optimize Retrieval-Augmented Generation (RAG) pipelines. It eliminates manual hyperparameter tuning using Bayesian optimization, automatically finding the best configuration for your dataset and use case.

Key Features

  • Broad Search Space - Optimizes 20+ RAG hyperparameters, including chunk size, overlap, embedding strategy, and LLM selection
  • Provider-Agnostic - Works with any LangChain-compatible model or provider
  • Partially Opinionated - Smart defaults with full flexibility for customization
  • Pareto-Optimal Results - Generates Pareto-optimal configurations for your specific data
  • Comprehensive Metrics - Quality (precision, recall, faithfulness), performance (latency, cost), and safety (toxicity, bias)

Installation

pip install rag-opt

🔥 Quick Start

1. Generate Training Questions

from langchain.chat_models import init_chat_model
from rag_opt.rag import DatasetGenerator

# Initialize LLM
llm = init_chat_model(
    model="gpt-3.5-turbo",
    model_provider="openai",
    api_key="sk-***"
)

# Generate Q&A pairs from your documents
data_gen = DatasetGenerator(llm, dataset_path="./data")
dataset = data_gen.generate(10)
dataset.to_json("./rag_dataset.json")

2. Define Your Search Space

Create rag_config.yaml:

chunk_size:
  bounds: [512, 1024]
  dtype: int

max_tokens:
  bounds: [256, 512]
  dtype: int

chunk_overlap:
  bounds: [0, 200]
  dtype: int

temperature:
  bounds: [0.0, 1.0]
  dtype: float

search_type:
  choices: ["similarity", "mmr", "hybrid"]

vector_store:
  choices:
    faiss: {}
    pinecone:
      api_key: "YOUR_API_KEY"
      index_name: "your-index"

embedding:
  choices:
    openai:
      api_key: "YOUR_API_KEY"
      models:
        - "text-embedding-3-large"
        - "text-embedding-ada-002"
    huggingface:
      models:
        - "all-MiniLM-L6-v2"

llm:
  choices:
    openai:
      api_key: "YOUR_API_KEY"
      models:
        - "gpt-4o"
        - "gpt-3.5-turbo"

k:
  bounds: [1, 10]
  dtype: int

use_reranker: false
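
In this file, a bounds entry defines a numeric range and a choices entry a categorical set. As a minimal sketch (not RAGOpt's API; the names below are illustrative assumptions), such a search space can be represented and sampled like this:

```python
# Illustrative sketch only: a search space mirroring the YAML above,
# represented as plain dicts, plus a helper that draws one random
# configuration from it. sample_config() is a hypothetical helper,
# not part of RAGOpt.
import random

search_space = {
    "chunk_size":  {"bounds": [512, 1024], "dtype": "int"},
    "temperature": {"bounds": [0.0, 1.0],  "dtype": "float"},
    "search_type": {"choices": ["similarity", "mmr", "hybrid"]},
    "k":           {"bounds": [1, 10],     "dtype": "int"},
}

def sample_config(space, rng=random):
    """Draw one configuration uniformly at random from the space."""
    config = {}
    for name, spec in space.items():
        if "choices" in spec:                      # categorical parameter
            config[name] = rng.choice(spec["choices"])
        elif spec["dtype"] == "int":               # integer range (inclusive)
            low, high = spec["bounds"]
            config[name] = rng.randint(low, high)
        else:                                      # continuous range
            low, high = spec["bounds"]
            config[name] = rng.uniform(low, high)
    return config

print(sample_config(search_space))
```

Bayesian optimization starts from random draws like these before the surrogate model takes over the proposals.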

3. Run Optimization

from rag_opt.dataset import TrainDataset
from rag_opt.optimizer import Optimizer

# Load dataset
train_dataset = TrainDataset.from_json("rag_dataset.json")

# Initialize optimizer
optimizer = Optimizer(
    train_dataset=train_dataset,
    config_path="rag_config.yaml",
    verbose=True
)

# Find optimal configuration
best_config = optimizer.optimize(n_trials=3, best_one=True)
best_config.to_json()

4. Get Your Optimized Config

Output example:

{
  "chunk_size": 500,
  "max_tokens": 100,
  "chunk_overlap": 200,
  "search_type": "hybrid",
  "k": 1,
  "temperature": 1.0,
  "embedding": {
    "provider": "openai",
    "model": "text-embedding-3-large"
  },
  "llm": {
    "provider": "openai",
    "model": "gpt-4o"
  },
  "vector_store": {
    "provider": "faiss"
  },
  "use_reranker": true
}
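
Since the exported file is plain JSON, downstream code can load it with the standard library. A minimal sketch, assuming the shape shown above:

```python
# Loading an exported configuration and reading fields. The JSON below
# follows the example output above; in practice you would read it from
# the file written by best_config.to_json().
import json

raw = """
{
  "chunk_size": 500,
  "search_type": "hybrid",
  "k": 1,
  "embedding": {"provider": "openai", "model": "text-embedding-3-large"},
  "use_reranker": true
}
"""

config = json.loads(raw)
print(config["embedding"]["model"])  # → text-embedding-3-large
```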

How It Works

  1. Dataset Generation - Create synthetic Q&A pairs from your documents
  2. Search Space Definition - Configure which parameters to optimize
  3. Bayesian Optimization - Intelligently sample and evaluate configurations
  4. Multi-Metric Evaluation - Assess quality, performance, and safety
  5. Pareto-Optimal Results - Get the best configurations for your priorities
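
The loop in steps 3-5 boils down to propose, evaluate, update. The toy below is NOT RAGOpt's implementation (the real framework uses a Bayesian surrogate model over the full search space); it only illustrates the loop structure with a crude explore/exploit heuristic over a single parameter:

```python
# Toy optimize-evaluate loop, for illustration only. A real Bayesian
# optimizer would fit a surrogate (e.g. a Gaussian process) to the
# history and maximize an acquisition function instead of suggest().
import random

def evaluate(config):
    # Stand-in for running the RAG pipeline and scoring it; here we
    # pretend the best chunk_size is 768, so the score peaks at 0 there.
    return -abs(config["chunk_size"] - 768)

def suggest(history, rng=random):
    if not history or rng.random() < 0.5:            # explore: random draw
        return {"chunk_size": rng.randint(512, 1024)}
    best_cfg, _ = max(history, key=lambda h: h[1])   # exploit: perturb best
    jitter = rng.randint(-50, 50)
    size = min(1024, max(512, best_cfg["chunk_size"] + jitter))
    return {"chunk_size": size}

history = []
for trial in range(20):
    cfg = suggest(history)
    history.append((cfg, evaluate(cfg)))

best_config, best_score = max(history, key=lambda h: h[1])
print(best_config, best_score)
```

The surrogate model is what lets Bayesian optimization find good regions in far fewer evaluations than this random-perturbation loop would need.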

RAGOpt vs Alternatives

  • AutoRAG: relies on grid search; RAGOpt's Bayesian optimization typically reaches a strong configuration in far fewer trials
  • Ragas: offers a fixed evaluation suite; RAGOpt's evaluation layer is a flexible framework, so you can bring your own metrics
  • Manual Tuning: RAGOpt replaces trial and error with a systematic, data-driven search that saves time and improves results

License

This project is licensed under the terms of the MIT license.



Download files

Download the file for your platform.

Source Distribution

rag_opt-0.11.0.tar.gz (60.0 kB)


Built Distribution


rag_opt-0.11.0-py3-none-any.whl (70.6 kB)


File details

Details for the file rag_opt-0.11.0.tar.gz.

File metadata

  • Download URL: rag_opt-0.11.0.tar.gz
  • Size: 60.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: pdm/2.26.3.dev4+gdaa938c3 CPython/3.12.12 Linux/6.11.0-1018-azure

File hashes

Hashes for rag_opt-0.11.0.tar.gz
  • SHA256: aeb0b3b0317514121bfd35e22aa3ff4053046c44ffd654affdc62cb930b34076
  • MD5: d329e7e6332ade547c95b7eba582d4ee
  • BLAKE2b-256: 3657a526967e1bd6ce2539f883564db2683e1d13bf164aa7c56b74f6302344fc
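
To check a downloaded artifact against a published digest, the standard library's hashlib is enough. A minimal sketch (the filename and SHA256 match the sdist listed above):

```python
# Compute a file's SHA256 in chunks and compare it to a published digest.
import hashlib

def sha256_of(path, chunk=65536):
    """Return the hex SHA256 digest of the file at `path`."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

expected = "aeb0b3b0317514121bfd35e22aa3ff4053046c44ffd654affdc62cb930b34076"
# After downloading the sdist:
# assert sha256_of("rag_opt-0.11.0.tar.gz") == expected
```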


File details

Details for the file rag_opt-0.11.0-py3-none-any.whl.

File metadata

  • Download URL: rag_opt-0.11.0-py3-none-any.whl
  • Size: 70.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: pdm/2.26.3.dev4+gdaa938c3 CPython/3.12.12 Linux/6.11.0-1018-azure

File hashes

Hashes for rag_opt-0.11.0-py3-none-any.whl
  • SHA256: 62468e9ac693f6963b0ec7ece32ad9db01defa840fee4de2c16ce5c3d0bc5eee
  • MD5: 2b9cdf8a409f472deef14faedb4da85a
  • BLAKE2b-256: 5d80d4f704b59615254b06d4c2cd67c0384c54a55850699a2c99e866fa013fde

