A Tree Search Library with Flexible API for LLM Inference-Time Scaling
TreeQuest
A flexible answer tree search library featuring AB-MCTS, useful for (but not limited to) LLM inference-time scaling.
Quick Start
```python
import random

import treequest as tq

# Each node is associated with a user-definable `state`.
State = str

# 1. Define a function to be used for node generation.
def generate(parent_state: State | None) -> tuple[State, float]:
    """Generate a new state and score based on the parent state."""
    if parent_state is None:  # None represents expansion from the root.
        new_state = "Initial state"
    else:
        new_state = f"State after {parent_state}"

    score = random.random()  # A score for the new state, normalized to the [0, 1] range.
    return new_state, score

# 2. Instantiate the algorithm and a search tree object.
algo = tq.ABMCTSA()
search_tree = algo.init_tree()

# 3. Run the search with a generation budget (10 in this case).
for _ in range(10):
    search_tree = algo.step(search_tree, {'Action A': generate})

# 4. Extract the best state and score.
best_state, best_node_score = tq.top_k(search_tree, algo, k=1)[0]
print(f"Best state: {best_state}, Score: {best_node_score}")
```
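The Quick Start uses `random.random()` as a stand-in; in practice the score should measure answer quality and stay within [0, 1]. A hypothetical example of a deterministic scorer (the target value and scoring rule below are illustrative and not part of TreeQuest):

```python
# Illustrative only: the state is a numeric guess, scored by closeness to a target.
TARGET = 42.0

def score_guess(guess: float) -> float:
    """Map absolute error to a score in [0, 1]; 1.0 means a perfect guess."""
    return 1.0 / (1.0 + abs(guess - TARGET))

print(score_guess(42.0))                       # exact guess scores 1.0
print(score_guess(41.0) > score_guess(30.0))   # closer guesses score higher
```

Any monotone mapping into [0, 1] works; what matters is that better states receive higher scores, since the search uses the score to decide where to expand next.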
Features
- Easy-to-use API with customizable node generation and node scoring logic.
- Support for AB-MCTS-A, AB-MCTS-M, and Multi-LLM AB-MCTS (see our paper for algorithm details).
- Checkpointing and resuming searches.
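Checkpointing can be done roughly as follows. This is a minimal sketch assuming the search tree object is picklable; TreeQuest may also expose dedicated save/load helpers, so check its API first:

```python
import os
import pickle
import tempfile

# Stand-in for the object returned by `algo.init_tree()` / `algo.step(...)`;
# any picklable object round-trips the same way.
search_tree = {"nodes": ["root", "child-1"], "scores": [0.0, 0.7]}

# Save a checkpoint mid-search...
path = os.path.join(tempfile.mkdtemp(), "search_tree.pkl")
with open(path, "wb") as f:
    pickle.dump(search_tree, f)

# ...and resume later by loading it back and continuing to call `algo.step`.
with open(path, "rb") as f:
    resumed_tree = pickle.load(f)

print(resumed_tree == search_tree)  # True
```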
Installation
uv
First, install uv. Then you can install TreeQuest with the following command:

```shell
uv add "treequest[abmcts-m]"
```
pip
Alternatively, you can use pip to install TreeQuest:

```shell
pip install "treequest[abmcts-m]"
```
Usage
Using an LLM as a Node Generator
You can use any object as a node state. You only need to define a generation function that takes the parent state as an argument and returns a (state, score) tuple:
```python
import dataclasses

import treequest as tq

@dataclasses.dataclass
class State:
    llm_answer: str
    score: float

def generate(parent_state: State | None) -> tuple[State, float]:
    """Generate a new node by calling an LLM."""
    if parent_state is None:
        state = initial_generation()
    else:
        state = refine_answer(parent_state.llm_answer, parent_state.score)

    return state, state.score

def initial_generation() -> State:
    """Call an LLM API to generate an initial answer."""
    ...

def refine_answer(llm_answer: str, score: float) -> State:
    """Call an LLM API to refine an answer."""
    ...

algo = tq.ABMCTSM()
search_tree = algo.init_tree()
for i in range(20):
    search_tree = algo.step(search_tree, {'Action Label': generate})

    # Log the best node during the search.
    if (i + 1) % 5 == 0:
        best_interim_state, _ = tq.top_k(search_tree, algo, k=1)[0]
        print(f"Iteration {i+1}: Best state so far = {best_interim_state}")

best_state, _ = tq.top_k(search_tree, algo, k=1)[0]
print(f"Best Answer: {best_state.llm_answer}, Best Score: {best_state.score}")
```
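The elided `initial_generation` and `refine_answer` bodies would wrap real LLM calls. To exercise the search loop without an API key, they can be stubbed out; the mock below is hypothetical and not part of TreeQuest:

```python
import dataclasses

@dataclasses.dataclass
class State:
    llm_answer: str
    score: float

def initial_generation() -> State:
    # Stand-in for an LLM call producing a first draft.
    return State(llm_answer="draft answer", score=0.3)

def refine_answer(llm_answer: str, score: float) -> State:
    # Stand-in for an LLM refinement call: "improves" the draft
    # and nudges the score upward, capped at 1.0.
    return State(llm_answer=f"refined: {llm_answer}", score=min(1.0, score + 0.2))

s = initial_generation()
s = refine_answer(s.llm_answer, s.score)
print(s.llm_answer)  # refined: draft answer
```

Swapping these stubs for real calls (and a real evaluator in place of the fixed score bump) is all that is needed to turn the example into a working pipeline.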
Using Multiple LLMs (and Beyond)
TreeQuest supports multiple action types. For example, you can provide multiple generation functions backed by different LLMs to represent different action types:
```python
from functools import partial

import treequest as tq

def generate(llm_name: str, parent_state=None):
    """Call an LLM API (via litellm, vllm, etc.) to generate a new node."""
    ...
    return new_state, new_score

llm_names = ["o4-mini", "gemini-2.5-pro"]

# Create a dict of different actions backed by different LLMs.
generate_fns = {llm_name: partial(generate, llm_name=llm_name) for llm_name in llm_names}

algo = tq.StandardMCTS()
search_tree = algo.init_tree()
for _ in range(20):
    search_tree = algo.step(search_tree, generate_fns)
```
The variation is not limited to LLM types; you can use different prompts, actions, scoring logic, etc. in generate_fns.
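For instance, each action can be a partial over a different prompt style rather than a different model. A small self-contained illustration (the prompt labels and the stub generator are hypothetical, standing in for real LLM calls):

```python
from functools import partial

def generate(prompt_style: str, parent_state=None):
    # Stand-in for an LLM call; returns a (state, score) tuple
    # as the generation-function contract requires.
    new_state = f"[{prompt_style}] answer to {parent_state or 'root'}"
    new_score = 0.5  # placeholder score in [0, 1]
    return new_state, new_score

prompt_styles = ["step-by-step", "concise"]
generate_fns = {style: partial(generate, style) for style in prompt_styles}

state, score = generate_fns["concise"](None)
print(state)  # [concise] answer to root
```

The search then treats each prompt style as a distinct action and can learn which one pays off where in the tree.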
Algorithms
AB-MCTS-A: AB-MCTS with Node Aggregation
AB-MCTS-A uses node aggregation for adaptive branching:
```python
import treequest as tq

# Instantiate the AB-MCTS-A algorithm.
ab_mcts_a = tq.ABMCTSA()

search_tree = ab_mcts_a.init_tree()
for _ in range(50):
    search_tree = ab_mcts_a.step(search_tree, generate_fns)
```
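The "adaptive branching" idea is, at each node, deciding whether to go wider (sample a fresh child) or deeper (refine an existing one). A toy Thompson-sampling rule over Beta posteriors conveys the flavor; this is a simplified illustration of the general idea only, not TreeQuest's actual AB-MCTS-A implementation (see the paper for that):

```python
import random

def choose_wider_or_deeper(wider_stats, deeper_stats):
    """Thompson sampling: draw from a Beta(successes + 1, failures + 1)
    posterior for each action type and pick the action with the larger draw."""
    wider_draw = random.betavariate(wider_stats[0] + 1, wider_stats[1] + 1)
    deeper_draw = random.betavariate(deeper_stats[0] + 1, deeper_stats[1] + 1)
    return "wider" if wider_draw > deeper_draw else "deeper"

# (successes, failures) observed so far for each action type.
random.seed(0)
choices = [choose_wider_or_deeper((8, 2), (2, 8)) for _ in range(100)]
print(choices.count("wider"))  # "wider" wins most draws given its better record
```

Because the posteriors are updated as results come in, the rule naturally shifts budget toward whichever of "wider" or "deeper" is currently paying off.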
AB-MCTS-M: AB-MCTS with Mixed Models
AB-MCTS-M leverages PyMC's mixed-modeling capabilities:
```python
import treequest as tq

# Instantiate the AB-MCTS-M algorithm.
ab_mcts_m = tq.ABMCTSM()

search_tree = ab_mcts_m.init_tree()
for _ in range(30):
    search_tree = ab_mcts_m.step(search_tree, generate_fns)
```
NOTE: To run AB-MCTS-M, you need to install extra dependencies with the `treequest[abmcts-m]` option.
Requirements
- Python 3.11+
Contributing
Contributions are welcome! Please see CONTRIBUTING.md for development tips.
Citation
```bibtex
@article{inoue2025wider,
  title={Wider or Deeper? Scaling LLM Inference-Time Compute with Adaptive Branching Tree Search},
  author={Inoue, Yuichi and Misaki, Kou and Imajuku, Yuki and Kuroki, So and Nakamura, Taishi and Akiba, Takuya},
  journal={arXiv preprint arXiv:2503.04412},
  year={2025}
}
```
License