
A Bayesian optimization research toolbox built on TensorFlow

Project description

Trieste


Documentation (develop) | Documentation (release) | Tutorials | API reference

What does Trieste do?

Trieste (pronounced tree-est) is a Bayesian optimization toolbox built on TensorFlow. Trieste is named after the bathyscaphe Trieste, the first vehicle to take a crew to Challenger Deep in the Mariana Trench, the lowest point on the Earth’s surface: the literal global minimum.

Why Trieste?

  • Highly modular and easily customizable design. Extend it with your own models or acquisition functions. Ideal for practitioners who want to use it in their systems, and for researchers wishing to implement their latest ideas.
  • Seamless integration with TensorFlow. Trieste fully leverages TensorFlow's automatic differentiation (no more writing gradients for your acquisition functions!) and its scalability on highly parallelized modern hardware (e.g. GPUs).
  • General-purpose toolbox. Advanced algorithms covering all corners of Bayesian optimization and active learning - batch, asynchronous, constrained, multi-fidelity, multi-objective - you name it, Trieste has it.
  • Versatile model support out of the box. From gold-standard Gaussian processes (GPs; GPflow) to alternatives such as sparse variational GPs, deep GPs (GPflux) and deep ensembles (Keras), which scale much better with the number of function evaluations.
  • Real-world oriented. The Ask-Tell interface lets users apply Bayesian optimization in a range of non-standard real-world settings where control over the black-box function is only partial (see the sketch after this list). Built on TensorFlow and comprehensively tested, Trieste is production-ready.
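
As a quick illustration of the last point, here is a minimal Ask-Tell sketch. It assumes the AskTellOptimizer class from trieste.ask_tell_optimization and mirrors the setup of the Getting started example below; treat it as a sketch rather than a definitive recipe.

from trieste.ask_tell_optimization import AskTellOptimizer
from trieste.models.gpflow import build_gpr, GaussianProcessRegression
from trieste.objectives import Branin, mk_observer

# set up the problem as in the Getting started example below
observer = mk_observer(Branin.objective)
initial_data = observer(Branin.search_space.sample(5))
model = GaussianProcessRegression(build_gpr(initial_data, Branin.search_space))

# drive the loop yourself: ask for query points, evaluate them however your
# system allows, then tell the optimizer about the new observations
ask_tell = AskTellOptimizer(Branin.search_space, initial_data, model)
new_query_points = ask_tell.ask()
ask_tell.tell(observer(new_query_points))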

Getting started

Here's a quick overview of the main components of a Bayesian optimization loop. For more details, see our Documentation, where multiple Tutorials cover both the basic functionality of the toolbox and more advanced usage.

Let's set up a synthetic black-box objective function we wish to minimize, for example the popular Branin function, and generate some initial data:

from trieste.objectives import Branin, mk_observer

observer = mk_observer(Branin.objective)

initial_query_points = Branin.search_space.sample(5)
initial_data = observer(initial_query_points)
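
The observer returns a trieste Dataset whose query points and observations are TensorFlow tensors; a quick shape check makes this concrete for the two-dimensional Branin function.

print(initial_data.query_points.shape)  # (5, 2): five points in the 2D Branin domain
print(initial_data.observations.shape)  # (5, 1): one observation per query point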

The first step is to create a probabilistic model of the objective function, for example a Gaussian process model:

from trieste.models.gpflow import build_gpr, GaussianProcessRegression

gpflow_model = build_gpr(initial_data, Branin.search_space)
model = GaussianProcessRegression(gpflow_model)
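
build_gpr also accepts options for tailoring the underlying GPflow model; for instance, for a noise-free objective one can pin the observation noise to a small value. The keyword argument below is an assumption rather than a documented guarantee, so check the API reference for the exact signature.

# assumed keyword argument: fix a small likelihood variance for a noise-free black box
gpflow_model = build_gpr(initial_data, Branin.search_space, likelihood_variance=1e-7)
model = GaussianProcessRegression(gpflow_model)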

The next ingredient is to choose an acquisition rule and acquisition function:

from trieste.acquisition import EfficientGlobalOptimization, ExpectedImprovement

acquisition_rule = EfficientGlobalOptimization(ExpectedImprovement())
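
The same rule can also propose batches of points per step. The sketch below swaps in a Monte Carlo batch acquisition function; the class name, import path and keywords are assumptions based on the batch support mentioned above, so consult the API reference before relying on them.

from trieste.acquisition import BatchMonteCarloExpectedImprovement

# assumed names: propose 4 points per step using a Monte Carlo batch acquisition
batch_rule = EfficientGlobalOptimization(
    BatchMonteCarloExpectedImprovement(sample_size=100), num_query_points=4
)

You would then pass batch_rule to the optimizer in place of acquisition_rule below.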

Finally, we run the Bayesian optimizer for a number of steps, using our model and acquisition rule, and check the minimum it found:

from trieste.bayesian_optimizer import BayesianOptimizer

bo = BayesianOptimizer(observer, Branin.search_space)
num_steps = 15
result = bo.optimize(num_steps, initial_data, model, acquisition_rule)
query_point, observation, arg_min_idx = result.try_get_optimal_point()
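
Besides the single optimal point, the result object keeps the full optimization history. The accessor names below are given as a hedged sketch; see the API reference for the complete interface.

# the final dataset contains the initial points plus every point queried during the run
dataset = result.try_get_final_dataset()
print(dataset.query_points.shape)  # (20, 2): 5 initial + 15 BO-queried points
final_model = result.try_get_final_model()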

Installation

Trieste supports Python 3.9+ and TensorFlow 2.5+, and uses semantic versioning.

For users

To install the latest (stable) release of the toolbox from PyPI, use pip:

$ pip install trieste

or to install from sources, run

$ pip install .

in the repository root.

For contributors

To install this project in editable mode, run the commands below from the root directory of the trieste repository.

$ git clone https://github.com/secondmind-labs/trieste.git
$ cd trieste
$ pip install -e .

For an installation that can also run the quality checks, as well as other details, see the guidelines for contributors.

Tutorials

Trieste has a documentation site with tutorials on how to use the library, and an API reference. You can also run the tutorials interactively. They can be found in the notebooks directory, and are written as Python scripts for running with Jupytext. To run them, first install trieste from sources as above, then install additional dependencies with

$ pip install -r notebooks/requirements.txt

Finally, run the notebooks with

$ jupyter-notebook notebooks

Alternatively, you can copy and paste the tutorials into fresh notebooks and avoid installing the library from source. To ensure you have the required plotting dependencies, simply run:

$ pip install trieste[plotting]

Importing Keras

Like tensorflow-probability, Trieste currently uses Keras 2. When using TensorFlow versions 2.16 onwards (which default to Keras 3), Keras needs to be imported from tf_keras rather than tf.keras. Alternatively, as a shortcut that works with all supported versions of TensorFlow, you can write:

from gpflow.keras import tf_keras
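
For example, the short sketch below builds a small Keras 2 network through this shortcut; tf_keras mirrors the familiar tf.keras API, so the same code runs on both older and newer TensorFlow versions.

from gpflow.keras import tf_keras

# tf_keras exposes the Keras 2 API regardless of the installed TensorFlow version
network = tf_keras.Sequential([
    tf_keras.layers.Dense(32, activation="relu"),
    tf_keras.layers.Dense(1),
])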

The Trieste Community

Getting help

Bugs, feature requests, pain points, annoying design quirks, etc.: Please use GitHub issues to flag bugs/issues/pain points, suggest new features, and discuss anything else related to the use of Trieste that in some sense involves changing the Trieste code itself. We positively welcome comments or concerns about usability, and suggestions for changes at any level of design. We aim to respond to issues promptly, but if you believe we may have forgotten about an issue, please feel free to add another comment to remind us.

Slack workspace

We have a public Secondmind Labs Slack workspace. Please use this invite link to join the #trieste channel, whether you'd just like to ask short informal questions or want to be involved in the discussion and future development of Trieste.

Contributing

All constructive input is very much welcome. For detailed information, see the guidelines for contributors.

Citing Trieste

To cite Trieste, please reference our arXiv paper, in which we review the framework and describe its design. Sample BibTeX is given below:

@misc{trieste2023,
  author = {Picheny, Victor and Berkeley, Joel and Moss, Henry B. and Stojic, Hrvoje and Granta, Uri and Ober, Sebastian W. and Artemev, Artem and Ghani, Khurram and Goodall, Alexander and Paleyes, Andrei and Vakili, Sattar and Pascual-Diaz, Sergio and Markou, Stratis and Qing, Jixiang and Loka, Nasrulloh R. B. S and Couckuyt, Ivo},
  title = {Trieste: Efficiently Exploring The Depths of Black-box Functions with TensorFlow},
  publisher = {arXiv},
  year = {2023},
  doi = {10.48550/ARXIV.2302.08436},
  url = {https://arxiv.org/abs/2302.08436}
}

License

Apache License 2.0

Download files

Download the file for your platform.

Source Distribution

trieste-4.2.2.tar.gz (230.8 kB)

Built Distribution

trieste-4.2.2-py3-none-any.whl (277.8 kB)

File details

Details for the file trieste-4.2.2.tar.gz.

File metadata

  • Download URL: trieste-4.2.2.tar.gz
  • Upload date:
  • Size: 230.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for trieste-4.2.2.tar.gz

  • SHA256: c18be336788933007166c6e33debed4637fe5c77aa5a3d64464d680df5c3ec28
  • MD5: 5d3262e146edafc8981e139d0e258588
  • BLAKE2b-256: 66e6a210d450d7f2ba7d7b0acddb478b0f61e667d68600dd70725a2a88befb87


File details

Details for the file trieste-4.2.2-py3-none-any.whl.

File metadata

  • Download URL: trieste-4.2.2-py3-none-any.whl
  • Upload date:
  • Size: 277.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for trieste-4.2.2-py3-none-any.whl

  • SHA256: 34cb92c435900706e9474e927cc1d6d8220276b8c9993c95befb91e31766d070
  • MD5: 67d35f8634650a85c5720cd6dbde12a2
  • BLAKE2b-256: e79c1c94df1355661c04758d1a749101f51d20a0b5f36c1a21cd649697d27b62

