☂️ BoCoEL
Bayesian Optimization as a Coverage Tool for Evaluating Large Language Models
🤔 Why BoCoEL?
Large language models are expensive and slow behemoths, and evaluating them on gigantic modern datasets only makes things worse.
If only there were a way to select a small yet meaningful subset of the corpus and still obtain a highly accurate evaluation…
Wait, that sounds like Bayesian optimization!
Bocoel works in the following steps:
- Encode individual entries into embeddings (much cheaper / faster than an LLM, and reusable).
- Use Bayesian optimization to select queries to evaluate.
- Use the queries to retrieve entries from the corpus (via the encoded embeddings).
- Profit.
The evaluations generated are easily managed by the provided manager utility.
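The steps above can be sketched in plain NumPy. Note this is NOT bocoel's actual API: the `encode` and `evaluate_llm` functions below are toy stand-ins for a real sentence encoder and a real (expensive) LLM evaluation, and the query proposals are random placeholders for what Bayesian optimization would produce.

```python
# Illustrative sketch of the encode -> select -> retrieve -> evaluate loop.
import zlib
import numpy as np

rng = np.random.default_rng(0)
corpus = [f"question {i}" for i in range(1000)]

def encode(text: str) -> np.ndarray:
    # Toy deterministic "embedding"; a real pipeline would use a sentence encoder.
    seed = zlib.crc32(text.encode())
    return np.random.default_rng(seed).standard_normal(8)

embeddings = np.stack([encode(t) for t in corpus])

def evaluate_llm(entry: str) -> float:
    # Stand-in for an expensive LLM evaluation of a single corpus entry.
    return rng.random()

# An optimizer proposes query points in embedding space; we retrieve the
# nearest corpus entry and evaluate only that entry.
scores = {}
for _ in range(20):                        # tens of samples, not thousands
    query = rng.standard_normal(8)         # placeholder for a BO proposal
    idx = int(np.argmin(np.linalg.norm(embeddings - query, axis=1)))
    scores.setdefault(idx, evaluate_llm(corpus[idx]))

estimate = float(np.mean(list(scores.values())))
print(f"evaluated {len(scores)}/{len(corpus)} entries, estimated score {estimate:.2f}")
```

The key point is the cost profile: embeddings are computed once and reused, while the expensive LLM call runs only on the handful of retrieved entries.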
🚀 Features
- 🎯 Accurately evaluate large language models with just tens of samples from your selected corpus.
- 💂♂️ Uses the power of Bayesian optimization to select an optimal set of samples for the language model to evaluate.
- 💯 Evaluates the corpus on the model in addition to evaluating the model on the corpus.
- 🤗 Integration with Hugging Face transformers and datasets.
- 🧩 Modular design.
🗺️ Roadmap: work in progress
- 📊 Visualization module of the evaluation.
- 🎲 Integration of alternative methods (random, k-medoids...) with Gaussian processes.
- 🥨 Integration with more backends such as vLLM and OpenAI's API.
⭐ Give us a star!
Like what you see? Please consider giving this a star (★)!
♾️ Bayesian Optimization
Simply put, Bayesian optimization balances exploration (sampling where the model is most uncertain) with exploitation (sampling where the predicted value is highest). It uses a Gaussian process as a surrogate for inference, and an acquisition function to decide where to sample next. See here for a more in-depth introduction.
Since Bayesian optimization works well with expensive-to-evaluate black-box models (read: LLMs), it is a perfect fit for this use case. Bocoel uses Bayesian optimization as a backbone for exploring the embedding space of the corpus, which lets it select a small subset that acts as a mini snapshot of the corpus.
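The surrogate-plus-acquisition idea can be shown with a toy Gaussian process and an upper-confidence-bound (UCB) acquisition in plain NumPy. Everything here is an illustrative assumption, not bocoel's internals: the hidden objective `f`, the RBF length-scale, and the UCB coefficient are all made up for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(a, b, ls=0.3):
    # Squared-exponential kernel between two sets of points.
    d = a[:, None, :] - b[None, :, :]
    return np.exp(-0.5 * np.sum(d**2, axis=-1) / ls**2)

# Hidden expensive objective (stand-in for "LLM score near this embedding").
f = lambda x: np.sin(3 * x[:, 0]) * np.exp(-x[:, 0] ** 2)

X = rng.uniform(-2, 2, size=(3, 1))        # a few initial evaluated points
y = f(X)
cand = np.linspace(-2, 2, 200)[:, None]    # candidate query points

for _ in range(10):
    K = rbf(X, X) + 1e-6 * np.eye(len(X))  # kernel matrix with jitter
    Ks = rbf(cand, X)
    mu = Ks @ np.linalg.solve(K, y)        # GP posterior mean
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    ucb = mu + 2.0 * np.sqrt(np.clip(var, 0.0, None))  # acquisition
    x_next = cand[np.argmax(ucb)]          # sample where UCB is highest
    X = np.vstack([X, x_next])
    y = np.append(y, f(x_next[None, :]))

print(f"best score found after {len(y)} evaluations: {y.max():.3f}")
```

The loop evaluates the expensive objective only a handful of times, yet homes in on high-scoring regions because the GP's uncertainty steers each new query.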
⬇️ Installation
I don't want optional dependencies:
pip install bocoel
Give me the full experience (all optional dependencies):
pip install "bocoel[all]"
🥰 Contributing
Openness and inclusiveness are taken very seriously. Please follow the guide to contributing and the code of conduct.
🏷️ License and Citation
The code is available under the Apache License.
TODO: Citation