RAG workflow Optimizer based on Bayesian Optimization
Project description
RAGOpt eliminates manual hyperparameter tuning in your RAG pipelines with Bayesian optimization 🚀
- Documentation: https://ragopt.aboneda.com
- Colab Notebook: https://colab.research.google.com/drive/1hrfAHCfm3x0Ov-amCEHpptMiyoqC-McE
RAGOpt is a Python framework to optimize Retrieval-Augmented Generation (RAG) pipelines. It eliminates manual hyperparameter tuning using Bayesian optimization, automatically finding the best configuration for your dataset and use case.
Key Features
- Optimizes 20+ RAG hyperparameters, including chunk size, overlap, embedding strategy, and LLM selection.
- Works with any LangChain-compatible model or provider.
- Partially opinionated: smart defaults, with full flexibility for customization.
- Generates Pareto-optimal configurations for your specific data.
- Comprehensive metrics: quality (precision, recall, faithfulness), performance (latency, cost), and safety (toxicity, bias).
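To make the quality metrics concrete, retrieval precision and recall can be computed from retrieved and ground-truth chunk IDs. This is an illustrative sketch, not RAGOpt's internal implementation; the helper name is hypothetical:

```python
def retrieval_precision_recall(retrieved_ids, relevant_ids):
    """Precision: fraction of retrieved chunks that are relevant.
    Recall: fraction of relevant chunks that were retrieved."""
    retrieved, relevant = set(retrieved_ids), set(relevant_ids)
    hits = len(retrieved & relevant)
    precision = hits / len(retrieved) if retrieved else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall

# 2 of 4 retrieved chunks are relevant; 2 of 3 relevant chunks were found
p, r = retrieval_precision_recall(["c1", "c2", "c3", "c4"], ["c2", "c4", "c9"])
print(p, r)  # 0.5 0.6666666666666666
```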
Installation
pip install rag-opt
🔥 Quick Start
1. Generate Training Questions
from langchain.chat_models import init_chat_model
from rag_opt.rag import DatasetGenerator

# Initialize the LLM used for question generation
llm = init_chat_model(
    model="gpt-3.5-turbo",
    model_provider="openai",
    api_key="sk-***",
)

# Generate Q&A pairs from your documents
data_gen = DatasetGenerator(llm, dataset_path="./data")
dataset = data_gen.generate(10)
dataset.to_json("./rag_dataset.json")
2. Define Your Search Space
Create rag_config.yaml:
chunk_size:
  bounds: [512, 1024]
  dtype: int
max_tokens:
  bounds: [256, 512]
  dtype: int
chunk_overlap:
  bounds: [0, 200]
  dtype: int
temperature:
  bounds: [0.0, 1.0]
  dtype: float
search_type:
  choices: ["similarity", "mmr", "hybrid"]
vector_store:
  choices:
    faiss: {}
    pinecone:
      api_key: "YOUR_API_KEY"
      index_name: "your-index"
embedding:
  choices:
    openai:
      api_key: "YOUR_API_KEY"
      models:
        - "text-embedding-3-large"
        - "text-embedding-ada-002"
    huggingface:
      models:
        - "all-MiniLM-L6-v2"
llm:
  choices:
    openai:
      api_key: "YOUR_API_KEY"
      models:
        - "gpt-4o"
        - "gpt-3.5-turbo"
k:
  bounds: [1, 10]
  dtype: int
use_reranker: false
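Conceptually, each entry in the config defines either a numeric range (`bounds` plus `dtype`) or a categorical choice (`choices`). A minimal sketch of drawing one candidate configuration from such a space, written in plain Python to mirror the YAML above (this is illustrative, not RAGOpt code):

```python
import random

# A subset of the search space above, expressed as a Python dict
search_space = {
    "chunk_size": {"bounds": [512, 1024], "dtype": "int"},
    "temperature": {"bounds": [0.0, 1.0], "dtype": "float"},
    "search_type": {"choices": ["similarity", "mmr", "hybrid"]},
    "k": {"bounds": [1, 10], "dtype": "int"},
}

def sample_config(space, rng=random):
    """Draw one candidate: ints and floats uniformly from their bounds,
    categorical parameters uniformly from their choices."""
    config = {}
    for name, spec in space.items():
        if "choices" in spec:
            config[name] = rng.choice(spec["choices"])
        elif spec["dtype"] == "int":
            lo, hi = spec["bounds"]
            config[name] = rng.randint(lo, hi)
        else:
            lo, hi = spec["bounds"]
            config[name] = rng.uniform(lo, hi)
    return config

print(sample_config(search_space))
```

A Bayesian optimizer replaces this uniform sampling with a surrogate model that proposes promising candidates based on earlier trials.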
3. Run Optimization
from rag_opt.dataset import TrainDataset
from rag_opt.optimizer import Optimizer

# Load the generated training dataset
train_dataset = TrainDataset.from_json("rag_dataset.json")

# Initialize the optimizer with the search space
optimizer = Optimizer(
    train_dataset=train_dataset,
    config_path="rag_config.yaml",
    verbose=True,
)

# Find the optimal configuration
best_config = optimizer.optimize(n_trials=3, best_one=True)
best_config.to_json()
4. Get Your Optimized Config
Output example:
{
  "chunk_size": 500,
  "max_tokens": 100,
  "chunk_overlap": 200,
  "search_type": "hybrid",
  "k": 1,
  "temperature": 1.0,
  "embedding": {
    "provider": "openai",
    "model": "text-embedding-3-large"
  },
  "llm": {
    "provider": "openai",
    "model": "gpt-4o"
  },
  "vector_store": {
    "provider": "faiss"
  },
  "use_reranker": true
}
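The exported config is plain JSON, so it can be loaded back with the standard library to wire up your own pipeline. A minimal sketch, with field names following the example output above:

```python
import json

# A trimmed copy of the example output above
config_json = """
{"chunk_size": 500, "search_type": "hybrid", "k": 1,
 "embedding": {"provider": "openai", "model": "text-embedding-3-large"},
 "llm": {"provider": "openai", "model": "gpt-4o"}}
"""

config = json.loads(config_json)

# Pull out the pieces your pipeline needs
print(config["llm"]["model"])        # gpt-4o
print(config["embedding"]["model"])  # text-embedding-3-large
print(config["k"])                   # 1
```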
How It Works
- Dataset Generation - Create synthetic Q&A pairs from your documents
- Search Space Definition - Configure which parameters to optimize
- Bayesian Optimization - Intelligently sample and evaluate configurations
- Multi-Metric Evaluation - Assess quality, performance, and safety
- Pareto-Optimal Results - Get the best configurations for your priorities
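The Pareto-optimal step can be illustrated independently of RAGOpt: a configuration is kept only if no other configuration is at least as good on every metric and strictly better on at least one. A minimal sketch assuming two objectives, quality (higher is better) and latency (lower is better):

```python
def pareto_front(points):
    """Keep the points not dominated by any other point.
    Each point is (quality, latency); higher quality and lower latency win."""
    front = []
    for q, lat in points:
        dominated = any(
            (q2 >= q and lat2 <= lat) and (q2 > q or lat2 < lat)
            for q2, lat2 in points
        )
        if not dominated:
            front.append((q, lat))
    return front

# (0.60, 1.5) is dominated by (0.80, 1.2): worse quality AND worse latency
configs = [(0.80, 1.2), (0.85, 2.0), (0.70, 0.9), (0.60, 1.5)]
print(pareto_front(configs))  # [(0.8, 1.2), (0.85, 2.0), (0.7, 0.9)]
```

Each surviving point represents a different trade-off, e.g. the fastest acceptable pipeline versus the highest-quality one, and you pick according to your priorities.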
RAGOpt vs Alternatives
- AutoRAG: RAGOpt replaces AutoRAG's grid search with Bayesian optimization, reaching strong configurations in far fewer trials.
- Ragas: RAGOpt's evaluation framework is flexible rather than fixed; bring your own metrics.
- Manual tuning: a systematic, data-driven search saves time and improves results.
License
This project is licensed under the terms of the MIT license.
File details
Details for the file rag_opt-0.11.0.tar.gz.
File metadata
- Download URL: rag_opt-0.11.0.tar.gz
- Upload date:
- Size: 60.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: pdm/2.26.3.dev4+gdaa938c3 CPython/3.12.12 Linux/6.11.0-1018-azure
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | aeb0b3b0317514121bfd35e22aa3ff4053046c44ffd654affdc62cb930b34076 |
| MD5 | d329e7e6332ade547c95b7eba582d4ee |
| BLAKE2b-256 | 3657a526967e1bd6ce2539f883564db2683e1d13bf164aa7c56b74f6302344fc |
File details
Details for the file rag_opt-0.11.0-py3-none-any.whl.
File metadata
- Download URL: rag_opt-0.11.0-py3-none-any.whl
- Upload date:
- Size: 70.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: pdm/2.26.3.dev4+gdaa938c3 CPython/3.12.12 Linux/6.11.0-1018-azure
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 62468e9ac693f6963b0ec7ece32ad9db01defa840fee4de2c16ce5c3d0bc5eee |
| MD5 | 2b9cdf8a409f472deef14faedb4da85a |
| BLAKE2b-256 | 5d80d4f704b59615254b06d4c2cd67c0384c54a55850699a2c99e866fa013fde |