
PyXAB - Python X-Armed Bandit


PyXAB is an open-source Python library of X-armed bandit algorithms, a family of optimizers for online black-box optimization, i.e., optimizing an objective without access to gradients. The problem is also known as the continuum-armed bandit (CAB), Lipschitz bandit, global optimization (GO), or bandit-based black-box optimization problem.

(Figure: optimization trajectory and heatmap)

PyXAB includes implementations of many X-armed bandit algorithms, from classics such as HOO (Bubeck et al., 2011), StoSOO (Valko et al., 2013), and HCT (Azar et al., 2014) to more recent work such as GPO (Shang et al., 2019) and VHCT (Li et al., 2021). PyXAB also provides the most commonly used synthetic objectives for evaluating the performance of different algorithms, as well as implementations of different hierarchical partitions of the parameter space.

Quick Example

First, define the black-box objective, the parameter domain, the partition of the space, and the algorithm, e.g.,

from PyXAB.synthetic_obj.Garland import Garland
from PyXAB.partition.BinaryPartition import BinaryPartition
from PyXAB.algos.HOO import T_HOO

target = Garland()              # synthetic objective to maximize
domain = [[0, 1]]               # one [low, high] pair per dimension
partition = BinaryPartition     # the partition class, not an instance
algo = T_HOO(rounds=1000, domain=domain, partition=partition)

At every round t, call algo.pull(t) to get a point. After observing the (stochastic) reward for that point, call algo.receive_reward(t, reward) to pass the feedback to the algorithm:

import numpy as np

for t in range(1, 1001):
    point = algo.pull(t)
    reward = target.f(point) + np.random.uniform(-0.1, 0.1)  # noisy observation (uniform noise)
    algo.receive_reward(t, reward)
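
After the last round, the algorithm's final recommendation can be retrieved. The method below appears in the PyXAB documentation, but verify the exact name against your installed version:

point = algo.get_last_point()   # the recommended point after all rounds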

Documentation

Installation

To install via pip, run the following commands:

pip install PyXAB                 # normal install
pip install --upgrade PyXAB       # or update if needed

To install from source via git, run the following commands:

git clone https://github.com/WilliamLwj/PyXAB.git
cd PyXAB
pip install .
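
To confirm the installation succeeded, one quick sanity check (ours, not part of the official instructions) is to import the package and print where it was installed:

python -c "import PyXAB; print(PyXAB.__file__)"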

Features:

X-armed bandit algorithms

  • Algorithms marked with a star (*) are meta-algorithms (wrappers); see the sketch after the table
Algorithm | Research Paper | Year
--------- | -------------- | ----
T-HOO | X-Armed Bandit | 2011
DOO | Optimistic Optimization of a Deterministic Function without the Knowledge of its Smoothness | 2011
SOO | Optimistic Optimization of a Deterministic Function without the Knowledge of its Smoothness | 2011
StoSOO | Stochastic Simultaneous Optimistic Optimization | 2013
HCT | Online Stochastic Optimization Under Correlated Bandit Feedback | 2014
POO* | Black-box optimization of noisy functions with unknown smoothness | 2015
GPO* | General Parallel Optimization Without A Metric | 2019
PCT | General Parallel Optimization Without A Metric | 2019
SequOOL | A Simple Parameter-free And Adaptive Approach to Optimization Under A Minimal Local Smoothness Assumption | 2019
StroquOOL | A Simple Parameter-free And Adaptive Approach to Optimization Under A Minimal Local Smoothness Assumption | 2019
VHCT | Optimum-statistical Collaboration Towards General and Efficient Black-box Optimization | 2021
VPCT | N.A. (GPO + VHCT) | N.A.
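
As an illustration of the meta-algorithms, the sketch below wraps the HCT base algorithm with GPO. The import paths and constructor arguments (in particular, algo taking the base algorithm class) are assumptions based on the PyXAB API reference; check the documentation before relying on them.

from PyXAB.algos.GPO import GPO
from PyXAB.algos.HCT import HCT
from PyXAB.partition.BinaryPartition import BinaryPartition

# Assumed API: GPO runs copies of the base algorithm (here HCT) under
# different smoothness parameters and keeps the best-performing copy.
algo = GPO(rounds=1000, domain=[[0, 1]], partition=BinaryPartition, algo=HCT)

The wrapper is then driven through the same pull/receive_reward loop as in the quick example.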

Hierarchical partition

Partition | Description
--------- | -----------
BinaryPartition | Equal-size binary partition of the parameter space; the split dimension is chosen uniformly at random
RandomBinaryPartition | Random-size binary partition of the parameter space; the split dimension is chosen uniformly at random
DimensionBinaryPartition | Equal-size partition with a binary split on every dimension, so each node has 2^d children
KaryPartition | Equal-size K-ary partition of the parameter space; the split dimension is chosen uniformly at random
RandomKaryPartition | Random-size K-ary partition of the parameter space; the split dimension is chosen uniformly at random
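
Any of these partitions can be passed to an algorithm in place of BinaryPartition from the quick example. The KaryPartition import path below mirrors the BinaryPartition one and is an assumption to verify against the package layout:

from PyXAB.algos.HOO import T_HOO
from PyXAB.partition.KaryPartition import KaryPartition  # assumed path

algo = T_HOO(rounds=1000, domain=[[0, 1]], partition=KaryPartition)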

Synthetic objectives

Plots of each objective appear on the project page.

  • Garland
  • DoubleSine
  • DifficultFunc
  • Ackley
  • Himmelblau
  • Rastrigin
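
Each objective exposes an f(point) method, as used in the quick example. A minimal sketch for a two-dimensional objective follows; the Himmelblau import path and the [-5, 5]^2 domain are assumptions:

from PyXAB.synthetic_obj.Himmelblau import Himmelblau  # assumed path

target = Himmelblau()
domain = [[-5, 5], [-5, 5]]    # one [low, high] pair per dimension
value = target.f([3.0, 2.0])   # evaluate the objective at a 2-D point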

Contributing

We appreciate all forms of help and contributions, including but not limited to

  • Star and watch our project
  • Open an issue for any bugs you find or features you want to add to our library
  • Fork our project and submit a pull request with your code

Please read the contributing instructions before submitting a pull request.

Citations

If you use our package in your research or projects, we kindly ask you to cite our work:

@article{li2021optimum,
  title={Optimum-statistical Collaboration Towards General and Efficient Black-box Optimization},
  author={Li, Wenjie and Wang, Chi-Hua and Song, Qifan and Cheng, Guang},
  journal={arXiv preprint arXiv:2106.09215},
  year={2021}
}
