
Project description

☂️ BoCoEL

Bayesian Optimization as a Coverage Tool for Evaluating Large Language Models


🤔 Why BoCoEL?

Large language models are expensive and slow behemoths, and evaluating them on gigantic modern datasets only makes it worse.

If only there were a way to select a meaningful (and small) subset of the corpus and still obtain a highly accurate evaluation...

Wait, sounds like Bayesian Optimization!

Bocoel works in the following steps:

  1. Encode each corpus entry into an embedding (far cheaper and faster than LLM inference, and reusable).
  2. Use Bayesian optimization to select queries to evaluate.
  3. Use the queries to retrieve entries from the corpus (via the encoded embeddings).
  4. Profit.
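The steps above can be sketched end to end with stand-in components. This is not bocoel's API: random vectors play the role of sentence embeddings, `score_with_llm` is a toy stand-in for the expensive LLM evaluation, and a plain scikit-learn Gaussian process with a UCB acquisition stands in for the optimizer.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Step 1: encode corpus entries into embeddings (stand-in: random 8-d vectors).
corpus_embeddings = rng.normal(size=(500, 8))

def score_with_llm(entry_index: int) -> float:
    """Stand-in for one expensive LLM evaluation of a corpus entry."""
    return float(np.sin(corpus_embeddings[entry_index, 0]))

# Step 2: Bayesian optimization proposes query points in embedding space.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
evaluated, scores = [], []
for step in range(20):
    if step < 5:
        query = rng.normal(size=8)                    # initial random exploration
    else:
        gp.fit(corpus_embeddings[evaluated], np.array(scores))
        candidates = rng.normal(size=(256, 8))
        mean, std = gp.predict(candidates, return_std=True)
        query = candidates[np.argmax(mean + 2.0 * std)]  # UCB acquisition

    # Step 3: retrieve the nearest corpus entry to the proposed query.
    nearest = int(np.argmin(np.linalg.norm(corpus_embeddings - query, axis=1)))
    if nearest not in evaluated:
        evaluated.append(nearest)
        # Step 4: only this small subset is ever evaluated by the "LLM".
        scores.append(score_with_llm(nearest))

print(f"evaluated {len(evaluated)} of {len(corpus_embeddings)} entries")
```

Only the handful of retrieved entries are scored, while the Gaussian process models the rest of the embedding space.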

The evaluations generated are easily managed by the provided manager utility.

To our knowledge, this is the first work aiming to reduce computation costs during evaluation (benchmarking) under a (possibly dynamic) budget.

🚀 Features

  • 🎯 Accurately evaluate large language models with just tens of samples from your selected corpus.
  • 💂‍♂️ Uses the power of Bayesian optimization to select an optimal subset of samples for the language model to evaluate.
  • 💯 Evaluate the corpus on the model in addition to evaluating the model on the corpus.
  • 🤗 Support for GPT2, Pythia, LLAMA and more through integration with Hugging Face transformers and datasets.
  • 🧩 Modular design.
  • 🔎 Efficient representation of the corpus / dataset, such as an N-sphere representation or whitening of the latent space, to improve evaluation quality.
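The last bullet's "whitening of the latent space" can be illustrated generically. This is a sketch of the idea, not bocoel's implementation: center the embedding matrix, then rotate and rescale it so its covariance becomes the identity (ZCA whitening).

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy embedding matrix with correlated dimensions, standing in for real embeddings.
embeddings = rng.normal(size=(1000, 16)) @ rng.normal(size=(16, 16))

mean = embeddings.mean(axis=0)
centered = embeddings - mean

# Eigendecomposition of the covariance gives the whitening transform.
cov = np.cov(centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
whitening = eigvecs @ np.diag(1.0 / np.sqrt(eigvals)) @ eigvecs.T  # ZCA whitening

whitened = centered @ whitening
# After whitening, the empirical covariance is (approximately) the identity,
# so no direction of the latent space dominates the others.
```

Whitening makes distances in the latent space more comparable across dimensions, which is what the retrieval and optimization steps rely on.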

⭐ Give us a star!

Like what you see? Please consider giving this a star (★)!

♾️ Bayesian Optimization

Simply put, Bayesian optimization trades off an exploration objective (the purple area in the image) against an exploitation objective (the height of the black dots). It uses Gaussian processes as a backbone for inference, and an acquisition function to decide where to sample next. See here for a more in-depth introduction.

Since Bayesian optimization works well with expensive-to-evaluate black-box models (read: LLMs), it is a perfect fit for this use case. Bocoel uses Bayesian optimization as a backbone for exploring the embedding space of the corpus, which lets it select a good subset that acts as a mini snapshot of the corpus.
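The exploration/exploitation trade-off can be shown on a one-dimensional toy function: the Gaussian-process posterior mean drives exploitation, the posterior standard deviation (the "purple area") drives exploration, and an upper-confidence-bound (UCB) acquisition combines the two. This is a generic illustration with scikit-learn, not bocoel's internals.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def black_box(x):
    """Stand-in for an expensive-to-evaluate black-box model."""
    return np.sin(3 * x) + 0.5 * x

x_seen = np.array([[0.2], [1.5], [2.8]])   # points evaluated so far
y_seen = black_box(x_seen).ravel()

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
gp.fit(x_seen, y_seen)

grid = np.linspace(0.0, 3.0, 200).reshape(-1, 1)
mean, std = gp.predict(grid, return_std=True)   # posterior mean and uncertainty

kappa = 2.0                      # larger kappa favors exploration
ucb = mean + kappa * std
next_x = grid[np.argmax(ucb)]    # where to spend the next expensive evaluation
```

The uncertainty `std` shrinks near already-evaluated points and grows in unexplored regions, which is exactly what steers the sampler toward informative queries.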

🏎️ Performance Implications

LLMs are painfully slow, especially generative ones (which is what "LLM" usually refers to), since sequence generation is sequential by nature.

Although bocoel requires an embedder to encode the entire corpus, embedders are orders of magnitude faster than LLMs, so that upfront cost is quickly recovered by practically any savings in LLM evaluations.
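A back-of-envelope cost model makes the claim concrete. The latencies below are illustrative assumptions, not measurements: say embedding one entry takes about 1 ms and one generative LLM evaluation takes about 1 s.

```python
# Back-of-envelope cost model (illustrative numbers, not benchmarks).
corpus_size = 10_000
subset_size = 50                      # entries actually sent to the LLM

embed_ms_per_entry = 1.0              # assumed embedder latency
llm_ms_per_entry = 1_000.0            # assumed generative-LLM latency

# Baseline: run the LLM on every corpus entry.
full_eval_ms = corpus_size * llm_ms_per_entry
# Subset evaluation: embed everything once, then query the LLM on a small subset.
bocoel_ms = corpus_size * embed_ms_per_entry + subset_size * llm_ms_per_entry

speedup = full_eval_ms / bocoel_ms
print(f"full: {full_eval_ms / 1e3:.0f}s, subset: {bocoel_ms / 1e3:.0f}s, ~{speedup:.0f}x")
```

Under these assumptions the one-time embedding pass (10 s) is dwarfed by the LLM calls it avoids, so the overall speedup tracks the ratio of corpus size to subset size.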

⬇️ Installation

I don't want optional dependencies:

pip install bocoel

Give me the full experience (all optional dependencies):

pip install "bocoel[all]"

🔬 Usage

See the folder examples/getting_started for a minimal example that gets you started with just a few lines of code.

✍️ Develop with BoCoEL

Usage examples are under the folder examples. API reference can be found here.

🥰 Contributing

Contributors wanted! Don't be shy. Feel free to file issues and PRs. For PRs, please follow the guide on contributing and the code of conduct. Openness and inclusiveness are taken very seriously.

🗺️ Roadmap: work in progress

  • 🪑 Simpler usage. I should provide a high-level wrapper for the entire library such that evaluations can be run in one line.
  • 📊 Visualization module of the evaluation.
  • 🎲 Integration of alternative methods (random, kmedoids...) with Gaussian process.
  • 🥨 Integration with more backends such as VLLM and OpenAI's API.
  • 🆕 Support for Python 3.12+

🏷️ License and Citation

The code is available under BSD-3 License.

If you find this project helpful in your research, please cite it as:

@misc{bocoel2024,
    title = {BoCoEL: Bayesian Optimization as a Coverage Tool for Evaluating Large Language Models},
    url = {https://bocoel.rentruewang.com/research/},
    author = {Wang, RenChu},
    month = {January},
    year = {2024}
}

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

bocoel-0.1.3.post0.tar.gz (132.1 kB view details)

Uploaded Source

Built Distribution

bocoel-0.1.3.post0-py3-none-any.whl (89.2 kB view details)

Uploaded Python 3

File details

Details for the file bocoel-0.1.3.post0.tar.gz.

File metadata

  • Download URL: bocoel-0.1.3.post0.tar.gz
  • Upload date:
  • Size: 132.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: pdm/2.19.3 CPython/3.12.7 Linux/6.5.0-1025-azure

File hashes

Hashes for bocoel-0.1.3.post0.tar.gz
  • SHA256: c813a60f8a87cfadba96b88fe349ec36a73f10186e1dcf6fb07b87a428569ca5
  • MD5: 102f50a3b31794b69013725fef8b9fd5
  • BLAKE2b-256: 7e8aee1fd80f3e4214a78eb6e6b2ca24460af1ae8e553d18b17432e7acfbb36e

See more details on using hashes here.

File details

Details for the file bocoel-0.1.3.post0-py3-none-any.whl.

File metadata

  • Download URL: bocoel-0.1.3.post0-py3-none-any.whl
  • Upload date:
  • Size: 89.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: pdm/2.19.3 CPython/3.12.7 Linux/6.5.0-1025-azure

File hashes

Hashes for bocoel-0.1.3.post0-py3-none-any.whl
  • SHA256: c984c6893331432e514c853a955f952b339c53027563d14b112b8ce1b8c680b6
  • MD5: d72706d9e2bfbc01d05506cf6df9222a
  • BLAKE2b-256: 70a3030e904423b606418ba000631b62083003ce0b9a2265bc93e74bc6724ba4

See more details on using hashes here.
