
BOAT

A Compositional Operation Toolbox for Gradient-based Bi-Level Optimization

Home | Installation | Docs | Tutorials | Examples


BOAT is a compositional OperAtion-level Toolbox for gradient-based BLO.

Unlike existing libraries that typically encapsulate fixed solver routines, BOAT factorizes the BLO workflow into atomic, reusable primitives. Through a unified constraint reconstruction perspective, it empowers researchers to automatically compose more than 85 solver variants from a compact set of 17 gradient operations.

This is the Jittor-based version of BOAT (boat-jit). It leverages Jittor’s Just-In-Time (JIT) compilation and efficient CUDA/cuDNN backends to significantly accelerate large-scale gradient-based BLO experiments.

BOAT supports unified execution across multiple backends, each maintained on a separate branch of the repository.

(Figure: BOAT structure)

🔑 Key Features

  • 🧩 Compositional Operation-Level Abstraction: Deconstructs BLO solvers into three modular stages: Gradient Mapping (GM), Numerical Approximation (NA), and First-Order (FO).
  • ⚡ Accelerated JIT Execution: Built on Jittor, enabling meta-operator fusion and high-performance execution on NVIDIA GPUs.
  • 🏭 Generative Solver Construction: Supports dynamic serialization of operations. Users can recover classical algorithms or discover novel hybrid solvers simply by changing configurations.
  • 🛠 Configuration-Driven: Define complex optimization strategies via simple JSON configurations, decoupling algorithmic logic from model definitions.
  • ✅ Comprehensive Testing: Achieves 99% code coverage through rigorous testing with pytest, ensuring software robustness.
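As an illustration of the configuration-driven design, a solver specification might look like the JSON sketch below. The field names (`dynamic_op`, `hyper_op`, `lower_iters`, `alpha_init`) are hypothetical placeholders used only to convey the idea, not the actual BOAT schema; consult the Docs for the real configuration keys.

```json
{
  "dynamic_op": ["DI", "GDA"],
  "hyper_op": ["RAD"],
  "lower_iters": 10,
  "GDA": { "alpha_init": 0.9 }
}
```

Swapping an entry such as `"RAD"` for `"CG"` or `"NS"` would, in this scheme, yield a different solver variant without touching model code.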

📚 Supported Operation Libraries

BOAT implements 17 atomic gradient operations organized into three modular libraries. These primitives can be dynamically serialized to generate more than 85 solver variants, covering the full spectrum of BLO methodologies.

| Library | Functional Role | Supported Atomic Operations |
| --- | --- | --- |
| GM-OL (Gradient Mapping) | Reconstructs the LL iterative trajectory; customizes the dynamic mapping rules ($\mathcal{T}_k$) to shape the optimization path and variable coupling. | NGD (Naive Gradient Descent), GDA (Gradient Descent Aggregation), DI (Dynamic Initialization), DM (Dual Multiplier / KKT) |
| NA-OL (Numerical Approx.) | Resolves the auxiliary gradient bottleneck; approximates implicit gradients or hyper-gradients via automatic differentiation, numerical inversion, or truncation. | RAD (Reverse-AD / Unrolled), RGT (Reverse Gradient Truncation), PTT (Pessimistic Trajectory Truncation), FD (Finite Difference / DARTS), CG (Conjugate Gradient), NS (Neumann Series), IGA (Implicit Gradient Approximation), IAD (Init-based AD / MAML), FOA (First-Order Approx. / Reptile) |
| FO-OL (First-Order) | Constructs single-level surrogates; reformulates the nested problem into first-order objectives using value functions or penalties, avoiding Hessian computations. | VSO (Value-Function Sequential), VFO (Value-Function First-Order), MESO (Moreau Envelope), PGDO (Penalty Gradient Descent) |
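To give some intuition for the NA-OL primitives, the Neumann Series (NS) operation approximates the inverse-Hessian–vector product required by implicit gradients with a truncated geometric series, $H^{-1}v \approx \eta \sum_{k=0}^{K}(I-\eta H)^k v$, which converges when the spectral radius of $(I-\eta H)$ is below 1. The following is a minimal standalone NumPy sketch of that idea (it does not use the BOAT API; the matrices are toy stand-ins):

```python
import numpy as np

# Toy stand-ins: H plays the role of the lower-level Hessian,
# v the upper-level gradient we want to precondition by H^{-1}.
H = np.array([[2.0, 0.0],
              [0.0, 4.0]])
v = np.array([1.0, 1.0])
eta = 0.1  # step size; eigenvalues of (I - eta*H) are 0.8 and 0.6 < 1

# Truncated Neumann series: H^{-1} v ~= eta * sum_k (I - eta*H)^k v
approx = np.zeros_like(v)
p = v.copy()
for _ in range(200):
    approx += p
    p = p - eta * (H @ p)  # p <- (I - eta*H) p
approx *= eta

exact = np.linalg.solve(H, v)  # direct solve for comparison
```

The same recursion only needs Hessian-vector products, which is why NS (like CG) avoids ever materializing or inverting the Hessian.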

🔨 Installation

BOAT-jit is built on top of Jittor. Please ensure Jittor is installed correctly before installing BOAT.

1. Install Jittor

Follow Jittor's official installation guide or use the commands below:

Linux (Ubuntu; on CentOS, install the equivalent packages with yum)

sudo apt install python3.8-dev libomp-dev
python3.8 -m pip install jittor
# Verify installation (Optional)
python3.8 -m jittor.test.test_example

Windows

python -m pip install jittor
python -m jittor.test.test_core

macOS

brew install libomp
python3.8 -m pip install jittor

2. Install BOAT-jit

Once Jittor is ready, install BOAT-jit via PyPI or Source:

# Install from PyPI
pip install boat-jit

# Or install from Source (Specific Branch)
git clone -b boat_jit --single-branch https://github.com/callous-youth/BOAT.git
cd BOAT
pip install -e .

⚡ How to Use BOAT

BOAT separates the problem definition from the solver configuration. Below is a Jittor-based example.

1. Load Configurations

import json
import boat_jit as boat

# Load algorithmic configurations
with open("configs/boat_config.json", "r") as f:
    boat_config = json.load(f)

# Load objective configurations
with open("configs/loss_config.json", "r") as f:
    loss_config = json.load(f)

2. Define Models and Optimizers

Use standard Jittor models and optimizers.

import jittor as jit

# Define models
upper_model = MyUpperModel()
lower_model = MyLowerModel()

# Define optimizers (Jittor syntax)
upper_opt = jit.nn.Adam(upper_model.parameters(), lr=1e-3)
lower_opt = jit.nn.SGD(lower_model.parameters(), lr=1e-2)

3. Customize & Initialize Problem

Inject runtime objects into the configuration.

# Configure BOAT with Jittor models/optimizers
boat_config["lower_level_model"] = lower_model
boat_config["upper_level_model"] = upper_model
boat_config["lower_level_opt"] = lower_opt
boat_config["upper_level_opt"] = upper_opt

# Initialize the BOAT core
b_optimizer = boat.Problem(boat_config, loss_config)

4. Build Solvers

Construct the lower- and upper-level solvers from the configuration.

# Pass optimizers explicitly if needed by the solver backend
b_optimizer.build_ll_solver(lower_opt)
b_optimizer.build_ul_solver(upper_opt)

5. Run Optimization Loop

# Training loop
for x_itr in range(1000):
    # Prepare data (Jittor Arrays or dicts)
    ul_feed_dict = {"data": ul_data, "target": ul_target}
    ll_feed_dict = {"data": ll_data, "target": ll_target}
    
    # Run one iteration
    loss, run_time = b_optimizer.run_iter(ll_feed_dict, ul_feed_dict, current_iter=x_itr)
    
    if x_itr % 100 == 0:
        print(f"Iter {x_itr}: UL Loss {loss:.4f}")

🌍 Applications

BOAT covers a wide spectrum of BLO applications, categorized by the optimization target:

  • Data-Centric: Data Hyper-Cleaning, Synthetic Data Reweighting, Diffusion Model Guidance.
  • Model-Centric: Neural Architecture Search (NAS), LLM Prompt Optimization, Parameter Efficient Fine-Tuning (PEFT).
  • Strategy-Centric: Meta-Learning, Hyperparameter Optimization (HO), Reinforcement Learning from Human Feedback (RLHF).

📝 Citation

If you find BOAT useful in your research, please consider citing our paper:

@article{liu2025boat,
  title={BOAT: A Compositional Operation Toolbox for Gradient-based Bi-Level Optimization},
  author={Liu, Yaohua and Pan, Jibao and Jiao, Xianghao and Gao, Jiaxin and Liu, Zhu and Liu, Risheng},
  journal={Submitted to Journal of Machine Learning Research (JMLR)},
  year={2025}
}

License

MIT License

Copyright (c) 2024 Yaohua Liu

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
