# Tutorials

## Example optimization problems
The following tutorials explore various optimization problems using Trieste; a minimal sketch of the core optimization loop they share appears after the list.
- Noise-free optimization with Expected Improvement
- Batch Bayesian Optimization with Batch Expected Improvement, Local Penalization, Kriging Believer and GIBBON
- Batch-sequential optimization with Thompson sampling
- Inequality constraints
- EGO with a failure region
- Multi-objective optimization with Expected HyperVolume Improvement
- Using deep Gaussian processes with GPflux for Bayesian optimization
- Bayesian optimization with deep ensembles
- Active Learning
- Active Learning for Gaussian Process Classification Model
- Bayesian active learning of failure or feasibility regions
- Trieste meets OpenAI Gym
- Scalable Thompson Sampling using Sparse Gaussian Process Models
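Most of these tutorials build on the same core loop: define a search space and an observer, fit a probabilistic model to some initial data, then let an acquisition rule choose further query points. The sketch below illustrates that pattern for the noise-free Expected Improvement case. It follows the quickstart pattern from the Trieste documentation, but the exact module paths (e.g. `ScaledBranin`, `build_gpr`) vary between Trieste versions, so treat this as an assumption-laden sketch rather than version-pinned code.

```python
from trieste.bayesian_optimizer import BayesianOptimizer
from trieste.models.gpflow import GaussianProcessRegression, build_gpr
from trieste.objectives import ScaledBranin  # module path may vary by version
from trieste.objectives.utils import mk_observer

# Toy objective and its search space (assumed: the scaled Branin benchmark).
search_space = ScaledBranin.search_space
observer = mk_observer(ScaledBranin.objective)

# A few random points to seed the surrogate model.
initial_data = observer(search_space.sample(5))

# GPflow-based Gaussian process surrogate with default settings.
model = GaussianProcessRegression(build_gpr(initial_data, search_space))

# By default, BayesianOptimizer selects each new query point
# by maximizing Expected Improvement.
result = BayesianOptimizer(observer, search_space).optimize(15, initial_data, model)
print(result.try_get_final_dataset().query_points[-1])  # last point queried
```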
## Frequently asked questions
The following tutorials (or sections thereof) explain how to use and extend specific Trieste functionality.
- How do I track and visualize an optimization loop in real time using TensorBoard? (a minimal setup is sketched after this list)
- How do I perform the data transformations required for training the model?
- How do I use Trieste in asynchronous objective evaluation mode?
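For the TensorBoard question above, the gist is that Trieste writes its per-step summaries to whatever TensorFlow summary writer you register before running the optimization loop. A minimal sketch, assuming `trieste.logging.set_tensorboard_writer` is available in your Trieste version and using a hypothetical log directory:

```python
import tensorflow as tf
import trieste

# Register a summary writer before starting the optimization loop.
# "logs/bo" is a hypothetical directory; view the results afterwards with
#   tensorboard --logdir logs/bo
trieste.logging.set_tensorboard_writer(tf.summary.create_file_writer("logs/bo"))
```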
## Run the tutorials interactively

The above tutorials are built from Jupytext notebooks in the notebooks directory of the trieste repository. These notebooks can also be run interactively. To do so, install the library from source, along with the additional notebook dependencies, by running the following in the repository root:
```bash
$ pip install . -r notebooks/requirements.txt
```
then run
```bash
$ jupyter-notebook notebooks
```
Alternatively, you can copy and paste the tutorials into fresh notebooks, avoiding the need to install the library from source. To ensure you have the required plotting dependencies, simply run:
```bash
$ pip install trieste[plotting]
```