
A Bilevel Optimizer Library in Python for Meta Learning

Reason this release was yanked:

beta version with wrong settings

Project description

Configuration & Status

[Badges: build status, codecov, documentation status, license, language, code style: black]

BOML is a modularized optimization library that unifies several ML algorithms into a common bilevel optimization framework. It provides interfaces for implementing popular bilevel optimization algorithms, so you can quickly build your own meta-learning neural network and test its performance.
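As a rough sketch of what "a common bilevel optimization framework" means here, meta-learning problems of this kind are usually posed with upper-level (meta) parameters and lower-level (task-specific) parameters. In generic notation (an illustration, not taken verbatim from the BOML documentation):

```latex
\min_{\theta} \; \sum_{t=1}^{T} F_t\!\left(\omega_t^{*}(\theta),\, \theta\right)
\qquad \text{s.t.} \qquad
\omega_t^{*}(\theta) \in \operatorname*{arg\,min}_{\omega_t} f_t(\omega_t, \theta),
```

where \theta denotes the shared meta-parameters optimized at the upper level, \omega_t the parameters of task t obtained at the lower level, f_t the task-level training loss, and F_t the corresponding validation (meta) objective.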

Key features of BOML

  • Unified bilevel optimization framework to address different categories of existing meta-learning paradigms.

  • Modularized algorithmic structure to integrate a variety of optimization techniques and popular methods.

  • Unit tests with Travis CI and Codecov reaching 99% coverage, and adherence to the PEP 8 naming conventions to ensure code quality.

  • Comprehensive documentation built with Sphinx, and flexible functional interfaces similar to conventional optimizers that help researchers quickly become familiar with the procedures.

Optimization Routine

The figure below illustrates the general optimization routine implemented by the organized modules in BOML.

[Figure: Bilevel Optimization Routine]
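The routine can be pictured as a nested loop: the lower level adapts task parameters starting from the meta-parameters, and the upper level updates the meta-parameters from the adapted results. Below is a minimal, self-contained sketch of that pattern on a toy quadratic problem. It uses plain NumPy, a first-order (FOMAML-style) meta-gradient, and hypothetical function names (`inner_loop`, `meta_step`); it illustrates the general routine and is not BOML's actual API.

```python
# A minimal sketch of the bilevel (meta-learning) routine on a toy quadratic
# problem. NOT BOML's API; `inner_loop` and `meta_step` are hypothetical names.
import numpy as np

rng = np.random.default_rng(0)

def task_grad(w, target):
    """Gradient of the per-task loss f(w) = 0.5 * ||w - target||^2."""
    return w - target

def inner_loop(meta_params, train_target, lr=0.1, steps=5):
    """Lower level: adapt task parameters, starting from the meta-parameters."""
    w = meta_params.copy()
    for _ in range(steps):
        w -= lr * task_grad(w, train_target)
    return w

def meta_step(meta_params, tasks, meta_lr=0.05):
    """Upper level: update meta-parameters from the adapted task parameters,
    using the first-order approximation (validation gradient at the adapted
    point), as in FOMAML-style methods."""
    meta_grad = np.zeros_like(meta_params)
    for train_target, val_target in tasks:
        adapted = inner_loop(meta_params, train_target)
        meta_grad += task_grad(adapted, val_target)
    return meta_params - meta_lr * meta_grad / len(tasks)

# Toy meta-training: all tasks have targets near 3.0, so a good shared
# initialization should drift toward that region.
meta_params = np.zeros(2)
for _ in range(200):
    tasks = [(3.0 + 0.1 * rng.standard_normal(2),
              3.0 + 0.1 * rng.standard_normal(2)) for _ in range(4)]
    meta_params = meta_step(meta_params, tasks)

print("learned initialization:", meta_params)  # approaches [3.0, 3.0]
```

The first-order update is only one choice for the outer step; exact hypergradients or implicit differentiation plug into the same loop structure without changing its overall shape.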




Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

boml-0.1.0.tar.gz (61.9 kB)

Uploaded: Source

Built Distribution

boml-0.1.0-py3-none-any.whl (86.1 kB)

Uploaded: Python 3
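For reference (inferred from the file names above rather than stated on this page): since the distributions are named boml with version 0.1.0, the usual installation route would be `pip install boml==0.1.0`; recent versions of pip will warn that this particular release was yanked.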
