gpflux.losses
This module provides the LikelihoodLoss adapter for using GPflow's Likelihood implementations as a tf.keras.losses.Loss.
Module Contents
- unwrap_dist(dist: tfp.distributions.Distribution) → tfp.distributions.Distribution
Unwrap the given distribution if it is wrapped in a _TensorCoercible.
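To illustrate the unwrapping idea without requiring TensorFlow Probability: layers such as tfp.layers.DistributionLambda return their distribution wrapped in a _TensorCoercible object that (to the best of our understanding) keeps the underlying distribution in a tensor_distribution attribute. The classes and the getattr-based helper below are hypothetical stand-ins sketching that behaviour, not the GPflux implementation.

```python
class Normal:
    """Stand-in for tfp.distributions.Normal (illustrative only)."""
    def __init__(self, loc, scale):
        self.loc, self.scale = loc, scale


class TensorCoercible:
    """Stand-in for the wrapper that tfp.layers.DistributionLambda produces."""
    def __init__(self, distribution):
        # The real wrapper exposes the wrapped distribution as an attribute;
        # `tensor_distribution` mirrors that here.
        self.tensor_distribution = distribution


def unwrap_dist(dist):
    """Return the wrapped distribution if present, otherwise pass through."""
    return getattr(dist, "tensor_distribution", dist)


plain = Normal(0.0, 1.0)
wrapped = TensorCoercible(plain)

assert unwrap_dist(plain) is plain    # already a bare distribution: identity
assert unwrap_dist(wrapped) is plain  # wrapped: returns the inner distribution
```

Unwrapping matters because the loss needs the distribution's statistics (mean, variance), not its tensor coercion.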
- class LikelihoodLoss(likelihood: gpflow.likelihoods.Likelihood)
Bases: gpflow.keras.tf_keras.losses.Loss
This class is a tf.keras.losses.Loss implementation that wraps a GPflow Likelihood instance.
When the prediction (last-layer output) is a Distribution q(f), calling this loss returns the negative variational expectation \(-\mathbb{E}_{q(f)}[\log p(y|f)]\). When the prediction is a tf.Tensor, calling this loss returns the negative log-probability \(-\log p(y|f)\).
When you use this loss function to train a Keras model, the value of this loss is not logged explicitly (in contrast, the layer-specific losses are logged, as is the overall model loss). To output this loss value explicitly, wrap this class in a tf.keras.metrics.Metric and add it to the model metrics.
Note
Use either this LikelihoodLoss (e.g. together with a tf.keras.Sequential model) or LikelihoodLayer (together with gpflux.models.DeepGP). Do not use both at once, because this would add the loss twice.
- Parameters:
likelihood – the GPflow likelihood object to use.
Note
If you want to train any parameters of the likelihood (e.g. the likelihood variance), you must include the likelihood as an attribute on a TrackableLayer instance that is part of your model. (This is not required when you instead use a gpflux.layers.LikelihoodLayer together with gpflux.models.DeepGP.)
- call(y_true: gpflow.base.TensorType, f_prediction: gpflow.base.TensorType | tfp.distributions.MultivariateNormalDiag) → tf.Tensor
Note that we deviate from the Keras Loss interface by calling the second argument f_prediction rather than y_pred.
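The two branches of call can be made concrete for a Gaussian likelihood \(p(y|f) = \mathcal{N}(y \mid f, \sigma^2)\) with a Gaussian prediction \(q(f) = \mathcal{N}(\mu, s^2)\), where the variational expectation has the closed form \(\mathbb{E}_{q(f)}[\log p(y|f)] = -\tfrac{1}{2}\log(2\pi\sigma^2) - \frac{(y-\mu)^2 + s^2}{2\sigma^2}\). The numpy sketch below is an illustration of that arithmetic under these assumptions, not the GPflux implementation (GPflow's likelihood classes compute their own variational expectations); the function names are our own.

```python
import numpy as np

rng = np.random.default_rng(0)

y, mu, s, sigma = 0.3, 0.1, 0.5, 0.2  # observation, q(f) mean/std, likelihood std


def neg_log_prob(y, f, sigma):
    # Tensor-prediction branch: -log N(y | f, sigma^2)
    return 0.5 * np.log(2 * np.pi * sigma**2) + (y - f) ** 2 / (2 * sigma**2)


def neg_variational_expectation(y, mu, s, sigma):
    # Distribution-prediction branch: -E_{q(f)}[log N(y | f, sigma^2)]
    # for q(f) = N(mu, s^2), using E_q[(y - f)^2] = (y - mu)^2 + s^2.
    return 0.5 * np.log(2 * np.pi * sigma**2) + ((y - mu) ** 2 + s**2) / (2 * sigma**2)


# Monte Carlo check of the closed form: average -log p(y|f) over samples f ~ q(f).
f_samples = rng.normal(mu, s, size=1_000_000)
mc_estimate = np.mean(neg_log_prob(y, f_samples, sigma))
assert abs(mc_estimate - neg_variational_expectation(y, mu, s, sigma)) < 0.05

# Sanity check: as q(f) collapses to a point mass (s -> 0), the variational
# expectation reduces to the plain log-density at f = mu.
assert np.isclose(neg_variational_expectation(y, mu, 0.0, sigma),
                  neg_log_prob(y, mu, sigma))
```

This also shows why the Distribution branch is preferable during training: it accounts for the predictive variance s^2 instead of evaluating the density at a single point.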