# Bounding-Box Tuner (BBT)

Adaptive Top-2 Bounding-Box Hyperparameter Tuner: lightweight hyperparameter search with median-based early pruning.

Bounding-Box Tuner (BBT) implements a simple but effective strategy to balance exploration and exploitation:
- **Top-2 bounding-box sampling**: at each iteration, define a hyper-rectangular "box" around your two best configurations and sample new candidates inside it.
- **Adaptive exploration schedule**: start with 35% global random sampling, then decay to 10% to focus on the promising region (both rates are configurable).
- **Median-based partial-training pruning**: after every epoch `e`, prune any trial whose validation score falls below the median of all previous epoch-`e` scores.
- **Optional parallel evaluation** via `n_workers`.
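Taken together, the three ideas above can be sketched in a few lines. The helper names and the linear decay schedule below are assumptions for illustration, not BBT's actual internals:

```python
import random
import statistics

def sample_in_box(best, second_best):
    # Top-2 box sampling: draw each hyperparameter uniformly from the
    # interval spanned by the two best configurations found so far.
    return {k: random.uniform(min(best[k], second_best[k]),
                              max(best[k], second_best[k]))
            for k in best}

def explore_rate(trial_idx, max_trials, start=0.35, end=0.10):
    # Adaptive schedule: linearly decay the global random-sampling
    # probability from `start` to `end` over the run.
    return start + (end - start) * trial_idx / max(max_trials - 1, 1)

def should_prune(score, previous_scores_at_epoch):
    # Median pruning: stop a trial whose epoch-e score falls below the
    # median of earlier trials' scores at the same epoch.
    return (len(previous_scores_at_epoch) >= 2
            and score < statistics.median(previous_scores_at_epoch))
```

On each iteration the tuner would flip a coin with probability `explore_rate(...)` to choose between a fresh global sample and a box sample, and pruned trials simply stop training early.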
## 🚀 Features

- **Easy integration**: plug BBT into any training loop via a one-line objective function
- **Fast partial training**: each trial runs for up to `max_epochs` epochs, but can be cut short if it underperforms
- **Minimal dependencies**: only `numpy`, `scipy`, and the standard-library `statistics` module
- **Configurable**: control total trials (`max_trials`), warm-up samples (`init_samples`), early stopping, and more
- **Parallel evaluation**: run up to `n_workers` trials concurrently
- **Configurable exploration**: set your own `explore_rate_start` & `explore_rate_end`
## 🔧 Installation

Install directly from GitHub:

```
git clone https://github.com/abdulvahapmutlu/bounding-box-tuner-bbt.git
cd bounding-box-tuner-bbt
pip install -r requirements.txt
```
## ⚡ Quickstart

```python
from bbt.tuner import adaptive_top2_box_tuner
from bbt.utils import param_space_dict

def objective_fn(params, train_ds, val_ds, epoch):
    """
    Train one epoch with `params` on train_ds,
    evaluate on val_ds, and return a scalar metric.
    """
    # YOUR TRAINING LOOP HERE
    return val_accuracy

best_params, best_score, trials_log, elapsed = adaptive_top2_box_tuner(
    train_dataset=my_train_ds,
    val_dataset=my_val_ds,
    param_space_dict=param_space_dict,
    objective_fn=objective_fn,
    max_trials=30,
    init_samples=5,
    early_stopping_rounds=10,
    max_epochs=5,
    explore_rate_start=0.35,  # initial random-sampling probability
    explore_rate_end=0.10,    # final random-sampling probability
    n_workers=0               # number of parallel workers
)

print("Best params:", best_params)
print("Best validation score:", best_score)
print(f"Ran {len(trials_log)} trials in {elapsed:.1f}s.")
```
## 📚 API Reference

### `adaptive_top2_box_tuner(...)`

```python
adaptive_top2_box_tuner(
    train_dataset,
    val_dataset,
    param_space_dict: dict,
    objective_fn: Callable[[dict, Any, Any, int], float],
    max_trials: int = 50,
    init_samples: int = 10,
    early_stopping_rounds: int = 30,
    max_epochs: int = 5,
    explore_rate_start: float = 0.35,
    explore_rate_end: float = 0.10,
    n_workers: int = 1,
) -> Tuple[dict, float, List[dict], float]
```
**Returns**

- `best_params` (dict): highest-scoring hyperparameter set
- `best_score` (float): corresponding validation metric
- `trials_log` (list of dict): one entry per trial with keys `"params"` (dict), `"score"` (float), `"pruned"` (bool), and `"epoch_scores"` (`{epoch: score}`)
- `total_time` (float): elapsed seconds
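Given the per-trial schema above, the log is easy to inspect after a run. For illustration, with a hand-made two-trial log:

```python
# Hand-made log for illustration; a real one comes from the tuner.
trials_log = [
    {"params": {"lr": 0.01}, "score": 0.91, "pruned": False,
     "epoch_scores": {1: 0.80, 2: 0.88, 3: 0.91}},
    {"params": {"lr": 0.30}, "score": 0.55, "pruned": True,
     "epoch_scores": {1: 0.55}},
]

# Count early-pruned trials and pick the best-scoring one.
n_pruned = sum(t["pruned"] for t in trials_log)
best_trial = max(trials_log, key=lambda t: t["score"])
print(f"{n_pruned}/{len(trials_log)} trials pruned; "
      f"best params: {best_trial['params']}")
```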
**Key args**

- `param_space_dict`: map each hyperparameter to its sampling info (min/max, type, etc.)
- `objective_fn`: called once per epoch; returns the validation score
- `max_trials`, `init_samples`, `early_stopping_rounds`, `max_epochs`: control the search budget & pruning
- `explore_rate_start`: starting probability of global random sampling
- `explore_rate_end`: ending probability of global random sampling
- `n_workers`: number of parallel processes (uses `multiprocessing.Pool`)

See `bbt/utils.py` for helpers: `sample_valid_params(param_space_dict)` and `sample_valid_params_bounding_box(box)`.
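The exact schema `param_space_dict` expects isn't spelled out here, so the layout below is only a plausible guess (min/max bounds plus a type tag, per the description above); check `bbt/utils.py` for the real keys:

```python
# Hypothetical schema -- verify against bbt/utils.py before use.
param_space_dict = {
    "lr":         {"type": "float", "min": 1e-4, "max": 1e-1},
    "batch_size": {"type": "int",   "min": 16,   "max": 256},
    "dropout":    {"type": "float", "min": 0.0,  "max": 0.5},
}
```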
## 🙌 Contributing

- Fork the repo
- Create a branch: `git checkout -b feature/YourFeature`
- Write code & tests
- Open a Pull Request
Please follow PEP 8 and include tests for new functionality.
## 📄 License
This project is licensed under the MIT License. See LICENSE for details.
## Project details
### File details

Details for the file `bbt_tuner-0.1.0.tar.gz`.

**File metadata**

- Download URL: bbt_tuner-0.1.0.tar.gz
- Upload date:
- Size: 6.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.6

**File hashes**

| Algorithm | Hash digest |
|---|---|
| SHA256 | `843a35f8c9ba67e0722a4c29322e030f842ffe1e880f355c3907509f32299bee` |
| MD5 | `fb4cee8aa6060c51805ac4dc2909fd48` |
| BLAKE2b-256 | `1b1fc9868bccc534373a3e4c671f50006125f9f3a4a98e16d2ac1a2f71e6d38a` |
### File details

Details for the file `bbt_tuner-0.1.0-py3-none-any.whl`.

**File metadata**

- Download URL: bbt_tuner-0.1.0-py3-none-any.whl
- Upload date:
- Size: 6.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.6

**File hashes**

| Algorithm | Hash digest |
|---|---|
| SHA256 | `1687cd604b628087519d4566732826ab4599fc78922ee73ea0b4e0335431c564` |
| MD5 | `7e8bd2175ae9492060b56e4498b07ff0` |
| BLAKE2b-256 | `10b0fb04b6feee77a27b354503fb52816c971c61809f97be6091a59f6d5c0aa7` |