
Problem-Agnostic Bilevel Optimization Toolkit in Python for Learning and Vision Tasks

Project description

BOAT

Task-Agnostic Operation Toolbox for Gradient-based Bilevel Optimization
Home | Installation | Docs | Tutorials | Examples


BOAT is a task-agnostic, gradient-based Bi-Level Optimization (BLO) Python library that abstracts the key steps of the BLO process into modular, flexible components. It enables researchers and developers to tackle learning tasks with a hierarchical, nested structure through customizable and diverse operator decomposition, encapsulation, and combination. BOAT supports specialized optimization strategies, including second-order or first-order, nested or non-nested, and with or without theoretical guarantees, catering to various levels of complexity.

To enhance flexibility and efficiency, BOAT incorporates the Dynamic Operation Library (D-OL) and the Hyper Operation Library (H-OL), alongside a collection of state-of-the-art first-order optimization strategies. BOAT also provides multiple implementation versions:

  • PyTorch-based: An efficient and widely-used version.
  • Jittor-based: An accelerated version for high-performance tasks.
  • MindSpore-based: Incorporating the latest first-order optimization strategies to support emerging application scenarios.

BOAT Structure

BOAT is designed to offer robust computational support for a broad spectrum of BLO research and applications, enabling innovation and efficiency in machine learning and computer vision.

🔑 Key Features

  • Dynamic Operation Library (D-OL): Incorporates 4 advanced dynamic system construction operations, enabling users to flexibly tailor optimization trajectories for BLO tasks.
  • Hyper-Gradient Operation Library (H-OL): Provides 9 refined operations for hyper-gradient computation, significantly enhancing the precision and efficiency of gradient-based BLO methods.
  • First-Order Gradient Methods (FOGMs): Integrates 4 state-of-the-art first-order methods, enabling fast prototyping and validation of new BLO algorithms. Thanks to its modularized design, BOAT allows flexible combinations of upper-level and lower-level operators, yielding nearly 85 algorithmic combinations and unparalleled adaptability.
  • Modularized Design for Customization: Empowers users to flexibly combine dynamic and hyper-gradient operations while customizing the specific forms of problems, parameters, and optimizer choices, enabling seamless integration into diverse task-specific codes.
  • Comprehensive Testing & Continuous Integration: Achieves 99% code coverage through rigorous testing with pytest and Codecov, coupled with continuous integration via GitHub Actions, ensuring software robustness and reliability.
  • Fast Prototyping & Algorithm Validation: Streamlined support for defining, testing, and benchmarking new BLO algorithms.
  • Unified Computational Analysis: Offers a comprehensive complexity analysis of gradient-based BLO techniques to guide users in selecting optimal configurations for efficiency and accuracy.
  • Detailed Documentation & Community Support: Offers thorough documentation with practical examples and API references via MkDocs, ensuring accessibility and ease of use for both novice and advanced users.
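To give a concrete taste of what a hyper-gradient operation computes, the sketch below implements reverse-mode automatic differentiation by hand on a toy problem: the lower-level gradient updates are unrolled with the computation graph kept alive, and the upper-level loss is backpropagated through them to the upper-level variable. This is a minimal illustration of the principle in plain PyTorch, not BOAT's internal implementation.

```python
import torch

# Toy bilevel problem:
#   upper level: min_x (w*(x) - 1)^2, where w*(x) = argmin_w (w - x)^2
# Reverse-mode hyper-gradient: unroll K lower-level gradient steps,
# keep the graph, then backpropagate the upper-level loss to x.

x = torch.tensor(0.0, requires_grad=True)   # upper-level variable
w = torch.tensor(5.0, requires_grad=True)   # lower-level variable (initial value)

lr = 0.5
for _ in range(20):                         # unrolled lower-level updates
    ll_loss = (w - x) ** 2
    g, = torch.autograd.grad(ll_loss, w, create_graph=True)
    w = w - lr * g                          # differentiable update step

ul_loss = (w - 1.0) ** 2                    # upper-level objective at w ≈ w*(x)
hypergrad, = torch.autograd.grad(ul_loss, x)

# Here w*(x) = x, so d/dx (x - 1)^2 = 2*(x - 1) = -2 at x = 0.
print(hypergrad.item())                     # → -2.0
```

BOAT's H-OL operations generalize this pattern (and its implicit, truncated, and first-order variants) behind a unified interface, so the unrolling and backpropagation above never have to be written by hand.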

🚀 Why BOAT?

Existing automatic differentiation (AD) tools primarily focus on specific optimization strategies, such as explicit or implicit methods, and are often tailored to meta-learning or other narrow application scenarios, offering little support for algorithm customization.

In contrast, BOAT expands the landscape of Bi-Level Optimization (BLO) applications by supporting a broader range of problem-adaptive operations. It bridges the gap between theoretical research and practical deployment, offering unparalleled flexibility to design, customize, and accelerate BLO techniques.

🏭 Applications

BOAT enables efficient implementation and adaptation of advanced BLO techniques for key applications, including but not limited to:

  • Hyperparameter Optimization (HO)
  • Neural Architecture Search (NAS)
  • Adversarial Training (AT)
  • Few-Shot Learning (FSL)
  • Generative Adversarial Learning
  • Transfer Attack
  • ...

🔨 Installation

To install BOAT, use the following command:

pip install boat-torch

or install the latest version from source:

git clone https://github.com/callous-youth/BOAT.git
cd BOAT
pip install -e .

How to Use BOAT

1. Load Configuration Files

BOAT relies on two key configuration files:

  • boat_config.json: Specifies optimization strategies and dynamic/hyper-gradient operations.
  • loss_config.json: Defines the loss functions for both levels of the BLO process.
import json
import boat_torch as boat

# Load configuration files
with open("path_to_configs/boat_config.json", "r") as f:
    boat_config = json.load(f)

with open("path_to_configs/loss_config.json", "r") as f:
    loss_config = json.load(f)

2. Define Models and Optimizers

You need to specify both the upper-level and lower-level models along with their respective optimizers.

import torch

# Define models
upper_model = UpperModel(*args, **kwargs)  # Replace with your upper-level model
lower_model = LowerModel(*args, **kwargs)  # Replace with your lower-level model

# Define optimizers
upper_opt = torch.optim.Adam(upper_model.parameters(), lr=0.01)
lower_opt = torch.optim.SGD(lower_model.parameters(), lr=0.01)
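UpperModel and LowerModel above are placeholders for your own task-specific modules. For a concrete smoke test, a toy pair might look like the following; the class definitions, feature dimensions, and the idea of learnable per-feature weights as the upper-level variable are purely illustrative and not part of BOAT's API.

```python
import torch
import torch.nn as nn

class UpperModel(nn.Module):
    """Illustrative upper-level model: a learnable per-feature weighting
    (standing in for hyperparameters such as feature or sample weights)."""
    def __init__(self, n_features):
        super().__init__()
        self.weights = nn.Parameter(torch.ones(n_features))

    def forward(self):
        return self.weights

class LowerModel(nn.Module):
    """Illustrative lower-level model: a plain linear classifier."""
    def __init__(self, n_features, n_classes):
        super().__init__()
        self.linear = nn.Linear(n_features, n_classes)

    def forward(self, x):
        return self.linear(x)

upper_model = UpperModel(n_features=10)
lower_model = LowerModel(n_features=10, n_classes=2)

upper_opt = torch.optim.Adam(upper_model.parameters(), lr=0.01)
lower_opt = torch.optim.SGD(lower_model.parameters(), lr=0.01)
```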

3. Customize BOAT Configuration

Modify the boat_config to include your dynamic and hyper-gradient methods, as well as model and variable details.

# Example combination of dynamic and hyper-gradient methods.
dynamic_method = ["NGD", "DI", "GDA"]  # Dynamic methods (demo only)
hyper_method = ["RGT", "RAD"]          # Hyper-gradient methods (demo only)

# Add methods and model details to the configuration
boat_config["dynamic_op"] = dynamic_method
boat_config["hyper_op"] = hyper_method
boat_config["lower_level_model"] = lower_model
boat_config["upper_level_model"] = upper_model
boat_config["lower_level_opt"] = lower_opt
boat_config["upper_level_opt"] = upper_opt
boat_config["lower_level_var"] = list(lower_model.parameters())
boat_config["upper_level_var"] = list(upper_model.parameters())

4. Initialize the BOAT Problem

Create the Problem instance from the two configuration dictionaries, then build the lower-level and upper-level solvers.

# Initialize the problem
b_optimizer = boat.Problem(boat_config, loss_config)

# Build solvers for lower and upper levels
b_optimizer.build_ll_solver()  # Lower-level solver
b_optimizer.build_ul_solver()  # Upper-level solver

5. Define Data Feeds

Prepare the data feeds for both levels of the BLO process; they are then fed into the upper-level and lower-level objective functions.

# Define data feeds (Demo Only)
ul_feed_dict = {"data": upper_level_data, "target": upper_level_target}
ll_feed_dict = {"data": lower_level_data, "target": lower_level_target}

6. Run the Optimization Loop

Execute the optimization loop, optionally customizing the solver strategy for dynamic methods.

# Set number of iterations
iterations = 1000

# Optimization loop (Demo Only)
for x_itr in range(iterations):
    # Run a single optimization iteration
    loss, run_time = b_optimizer.run_iter(ll_feed_dict, ul_feed_dict, current_iter=x_itr)

License

MIT License

Copyright (c) 2024 Yaohua Liu

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

