# AutoArena

Create leaderboards ranking LLM outputs against one another using automated judge evaluation.
- 🏆 Rank outputs from different LLMs, RAG setups, and prompts to find the best configuration of your system
- ⚔️ Perform automated head-to-head evaluation using judges from OpenAI, Anthropic, Cohere, and more
- 🤖 Define and run your own custom judges, connecting to internal services or implementing bespoke logic
- 💻 Run application locally, getting full control over your environment and data
## 🔥 Getting Started

Install from PyPI:

```shell
pip install autoarena
```

Run as a module and visit `localhost:8899` in your browser:

```shell
python -m autoarena
```
With the application running, getting started is simple:

- Create a project via the UI.
- Add responses from a model by selecting a CSV file with `prompt` and `response` columns.
- Configure an automated judge via the UI. Note that most judges require credentials, e.g. `X_API_KEY`, in the environment where you're running AutoArena.
- Add responses from a second model to kick off an automated judging task using the judges you configured in the previous step to decide which of the models you've uploaded provided a better `response` to a given `prompt`.

That's it! After these steps you're fully set up for automated evaluation on AutoArena.
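The upload CSVs in the steps above can be produced with a short script. A minimal sketch, assuming hypothetical model outputs; the file name and example rows are illustrative, and AutoArena only requires the `prompt` and `response` column headers:

```python
import csv

# Illustrative rows; the file name "model-a.csv" is an assumption.
rows = [
    {"prompt": "What is the capital of France?", "response": "Paris."},
    {
        "prompt": "Summarize photosynthesis in one sentence.",
        "response": "Plants convert light, water, and CO2 into sugar and oxygen.",
    },
]

# Write a CSV with the two required columns: "prompt" and "response".
with open("model-a.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["prompt", "response"])
    writer.writeheader()
    writer.writerows(rows)
```

Repeating this for a second model's outputs (e.g. a `model-b.csv` over the same prompts) gives you the two files needed to kick off head-to-head judging.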
## 📄 Formatting Your Data

AutoArena requires two pieces of information to test a model: the input `prompt` and the corresponding model `response`.

- `prompt`: the inputs to your model. When uploading responses, any other models that have been run on the same prompts are matched and evaluated using the automated judges you have configured.
- `response`: the output from your model. Judges decide which of two models produced a better response, given the same prompt.
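Conceptually, matching joins uploaded responses on identical prompt text so that judges always compare two responses to the same prompt. A simplified sketch of that pairing idea, with made-up data; this illustrates the concept, not AutoArena's internal implementation:

```python
# Responses from two hypothetical models, keyed by prompt text.
model_a = {
    "What is 2 + 2?": "4",
    "Name a primary color.": "Red",
}
model_b = {
    "What is 2 + 2?": "2 + 2 equals 4.",
    "Name a prime number.": "7",  # no matching prompt in model_a
}

# Only prompts shared by both models form head-to-head pairs for judging.
head_to_head = [
    (prompt, model_a[prompt], model_b[prompt])
    for prompt in model_a
    if prompt in model_b
]
# → [("What is 2 + 2?", "4", "2 + 2 equals 4.")]
```

Prompts that appear for only one model have no opponent, so they contribute no head-to-head comparison.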
## 📂 Data Storage

Data is stored in `./data/<project>.duckdb` files in the directory where you invoked AutoArena. See `data/README.md` for more details on data storage in AutoArena.
## 🦾 Development

AutoArena uses uv to manage dependencies. To set up this repository for development, run:

```shell
uv venv && source .venv/bin/activate
uv pip install --all-extras -r pyproject.toml
uv tool run pre-commit install
```

To run AutoArena for development, you will need to run both the backend and frontend services:

- Backend: `uv run python3 -m autoarena --dev` (the `--dev`/`-d` flag enables automatic service reloading when source files change)
- Frontend: see `ui/README.md`

To build a release tarball in the `./dist` directory:

```shell
./scripts/build.sh
```