Packaged fork from uber-research/TuRBO
Project description
Overview
This is the code release for the TuRBO algorithm from *Scalable Global Optimization via Local Bayesian Optimization*, which appeared at NeurIPS 2019. This implementation targets the noise-free case and may not work well when observations are noisy, since the center of the trust region should then be chosen based on the posterior mean.
Note that TuRBO is a minimization algorithm, so make sure you reformulate any maximization problem, e.g., by negating the objective.
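For example, a maximization objective can be wrapped before handing it to the optimizer. A minimal sketch (the wrapper name is illustrative, not part of the package):

```python
import numpy as np

def as_minimization(f_max):
    """Wrap a maximization objective for a minimizer:
    max f(x) is equivalent to min -f(x)."""
    def f_min(x):
        return -f_max(x)
    return f_min

# Example: maximize a concave function whose maximum value is 0 at x = 1
f_max = lambda x: -np.sum((x - 1.0) ** 2)
f_min = as_minimization(f_max)
print(f_min(np.array([1.0, 1.0])))  # prints 0.0 (negated maximum)
```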
Benchmark functions
Robot pushing
The original code for the robot pushing problem is available at https://github.com/zi-w/Ensemble-Bayesian-Optimization. We have made the following changes to the code when running our experiments:
- We turned off the visualization, which speeds up the function evaluations.
- We replaced all instances of `np.random.normal(0, 0.01)` with `np.random.normal(0, 1e-6)` in `push_utils.py`. This makes the function close to noise-free. Another option is to average over several evaluations using the original code.
- We flipped the sign of the objective function to turn this into a minimization problem.
Dependencies: `numpy`, `pygame`, `box2d-py`
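The averaging alternative mentioned above can be sketched as a simple wrapper. The noisy objective below is a synthetic stand-in, not the actual robot pushing reward:

```python
import numpy as np

def averaged(f, n_repeats=10):
    """Reduce observation noise by averaging repeated evaluations,
    as an alternative to shrinking the noise in push_utils.py."""
    def f_avg(x):
        return np.mean([f(x) for _ in range(n_repeats)])
    return f_avg

# Demo with a synthetic noisy objective
rng = np.random.default_rng(0)
noisy_f = lambda x: np.sum(x ** 2) + rng.normal(0, 0.01)
f = averaged(noisy_f, n_repeats=50)
```

Averaging over `n_repeats` evaluations shrinks the noise standard deviation by a factor of `sqrt(n_repeats)`, at the cost of proportionally more function evaluations.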
Rover
The original code for the rover problem is also available at https://github.com/zi-w/Ensemble-Bayesian-Optimization. We used the large version of the problem, which has 60 dimensions, and we flipped the sign of the objective function to turn this into a minimization problem.
Dependencies: `numpy`, `scipy`
Lunar
The lunar lander code is available in the OpenAI gym: https://github.com/openai/gym. The goal of the problem is to learn the parameter values of a controller for the lunar lander. The controller we learn is a modification of the original heuristic controller, which takes the form:
```python
import numpy as np

def heuristic_Controller(s, w):
    # s: 8-dimensional lander state, w: 12 controller parameters
    angle_targ = s[0] * w[0] + s[2] * w[1]
    if angle_targ > w[2]:
        angle_targ = w[2]
    if angle_targ < -w[2]:
        angle_targ = -w[2]
    hover_targ = w[3] * np.abs(s[0])

    angle_todo = (angle_targ - s[4]) * w[4] - (s[5]) * w[5]
    hover_todo = (hover_targ - s[1]) * w[6] - (s[3]) * w[7]

    if s[6] or s[7]:  # legs are in contact with the ground
        angle_todo = w[8]
        hover_todo = -(s[3]) * w[9]

    a = 0  # do nothing
    if hover_todo > np.abs(angle_todo) and hover_todo > w[10]:
        a = 2  # fire main engine
    elif angle_todo < -w[11]:
        a = 3  # fire right engine
    elif angle_todo > +w[11]:
        a = 1  # fire left engine
    return a
```
We use the constraints 0 <= w_i <= 2 for all parameters. We use `INITIAL_RANDOM = 1500.0` to make the problem more challenging.
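For optimization, the twelve weights live in the box [0, 2]^12. A minimal sketch of the corresponding bounds and a uniform random initial design (the variable names are illustrative; TuRBO itself draws its initial points from a Latin hypercube):

```python
import numpy as np

dim = 12                  # controller parameters w_0, ..., w_11
lb = np.zeros(dim)        # lower bounds: w_i >= 0
ub = 2.0 * np.ones(dim)   # upper bounds: w_i <= 2

# Uniform random initial design inside the box
rng = np.random.default_rng(0)
n_init = 20
W0 = lb + (ub - lb) * rng.random((n_init, dim))
```

Each row of `W0` is one candidate weight vector that could be scored by running lunar lander episodes with the heuristic controller.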
For more information about the logic behind this controller and how to integrate it with `gym`, take a look at the original heuristic controller source code: https://github.com/openai/gym/blob/master/gym/envs/box2d/lunar_lander.py#L364
Dependencies: `gym`, `box2d-py`
Cosmological constant
The code for the cosmological constant problem is available here: https://ascl.net/1306.012. You need to follow the instructions and compile the Fortran code. This gives you an executable, `CAMB`, that you can call to run the simulation.
The parameter names and bounds that we tune are the following:
- `ombh2`: [0.01, 0.25]
- `omch2`: [0.01, 0.25]
- `omnuh2`: [0.01, 0.25]
- `omk`: [0.01, 0.25]
- `hubble`: [52.5, 100]
- `temp_cmb`: [2.7, 2.8]
- `hefrac`: [0.2, 0.3]
- `mneu`: [2.9, 3.09]
- `scalar_amp`: [1.5e-9, 2.6e-8]
- `scalar_spec_ind`: [0.72, 5]
- `rf_fudge`: [0, 100]
- `rf_fudge_he`: [0, 100]
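One way to feed candidate points to the compiled executable is to render each point as `name = value` lines for CAMB's parameter file. The sketch below is an assumption about the workflow, not code from this package; a real run would merge these lines into the full `params.ini` shipped with CAMB and invoke the binary, e.g. via `subprocess.run(["./camb", "params.ini"])`:

```python
import numpy as np

# Names and bounds copied from the list above
BOUNDS = {
    "ombh2": (0.01, 0.25),
    "omch2": (0.01, 0.25),
    "omnuh2": (0.01, 0.25),
    "omk": (0.01, 0.25),
    "hubble": (52.5, 100.0),
    "temp_cmb": (2.7, 2.8),
    "hefrac": (0.2, 0.3),
    "mneu": (2.9, 3.09),
    "scalar_amp": (1.5e-9, 2.6e-8),
    "scalar_spec_ind": (0.72, 5.0),
    "rf_fudge": (0.0, 100.0),
    "rf_fudge_he": (0.0, 100.0),
}

lb = np.array([b[0] for b in BOUNDS.values()])
ub = np.array([b[1] for b in BOUNDS.values()])

def ini_lines(x):
    """Render one candidate point x (length 12) as 'name = value'
    lines that could be merged into CAMB's params.ini."""
    return [f"{name} = {v}" for name, v in zip(BOUNDS, x)]

# Example: the midpoint of the search box
mid = 0.5 * (lb + ub)
lines = ini_lines(mid)
```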
Examples
Check the examples folder for two examples of how to use TuRBO-1 and TuRBO-n.
Citing us
The final version of the paper is available at: http://papers.nips.cc/paper/8788-scalable-global-optimization-via-local-bayesian-optimization.
```
@inproceedings{eriksson2019scalable,
  title     = {Scalable Global Optimization via Local {Bayesian} Optimization},
  author    = {Eriksson, David and Pearce, Michael and Gardner, Jacob and Turner, Ryan D and Poloczek, Matthias},
  booktitle = {Advances in Neural Information Processing Systems},
  pages     = {5496--5507},
  year      = {2019},
  url       = {http://papers.nips.cc/paper/8788-scalable-global-optimization-via-local-bayesian-optimization.pdf},
}
```
Project details
Download files
File details
Details for the file `uber_turbo-0.1.1.tar.gz`.
File metadata
- Download URL: uber_turbo-0.1.1.tar.gz
- Upload date:
- Size: 12.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.7.1 CPython/3.11.1 Darwin/23.4.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | `de71cc2ab371fb7b3ac7c7d642a84ef09ed25018e828efc3c7dbae897a05b92f`
MD5 | `45cb02e96d912373a249e21aefb2c4d1`
BLAKE2b-256 | `d37a034d017827ee09b3b5308dd1e144e7134ff7d78258bdd86d31fead247c6d`
File details
Details for the file `uber_turbo-0.1.1-py3-none-any.whl`.
File metadata
- Download URL: uber_turbo-0.1.1-py3-none-any.whl
- Upload date:
- Size: 14.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.7.1 CPython/3.11.1 Darwin/23.4.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | `fef3878d3dc728b89096f75faa4b512280b776720ac51ec5236bfd02db7528a4`
MD5 | `899bac7ebdef9c7beca2c6511ef06a78`
BLAKE2b-256 | `fe4d3ebe566b45fac8b3eb73d29c16e61aa43a564d952cf244501260b8e3c11c`