Tutorials
Optimization problems
The following tutorials illustrate solving different types of optimization problems using Trieste.
- Introduction to Bayesian optimization
- Batch Bayesian optimization
- Thompson sampling
- Inequality constraints
- Explicit constraints
- Failure regions
- Multi-objective optimization
- Deep Gaussian processes
- Deep ensembles
- Active learning
- Active learning for binary classification
- Active learning of feasibility regions
- OpenAI Gym
- Scalable Thompson sampling
- Batching with Sharpe Ratio
- Multifidelity modelling
- High-dimensional Bayesian optimization
- Trust region Bayesian optimization
- Mixed search spaces
Frequently asked questions
The following tutorials explain how to use and extend specific Trieste functionality:
- How do I track and visualize an optimization loop in real time using TensorBoard?
- What are the key Python types used in Trieste, and how can they be extended?
- How do I externally control the optimization loop via an Ask-Tell interface?
- How do I perform the data transformations required for training the model?
- How do I use Trieste in asynchronous objective evaluation mode?
Run the tutorials interactively
The above tutorials are built from Jupytext notebooks in the notebooks directory of the trieste repository. These notebooks can also be run interactively. To do so, install the library from source, along with the additional notebook dependencies, by running the following from the repository root:
$ pip install . -r notebooks/requirements.txt
then launch the notebook server with
$ jupyter-notebook notebooks
Alternatively, you can copy and paste the tutorials into fresh notebooks, avoiding the need to install the library from source. To ensure you have the required plotting dependencies, run:
$ pip install trieste[plotting]