gpflux.sampling#
This module enables you to sample from (Deep) GPs efficiently and consistently.
Submodules#
Package Contents#
- class KernelWithFeatureDecomposition(kernel: gpflow.kernels.Kernel | None, feature_functions: gpflow.keras.tf_keras.layers.Layer, feature_coefficients: gpflow.base.TensorType)#
Bases: gpflow.kernels.Kernel
This class represents a kernel together with its finite feature decomposition:
\[k(x, x') = \sum_{i=0}^{L} \lambda_i \phi_i(x) \phi_i(x'),\]
where \(\lambda_i\) and \(\phi_i(\cdot)\) are the coefficients and features, respectively.
The decomposition can be derived from Mercer or Bochner’s theorem. For example, feature-coefficient pairs could be eigenfunction-eigenvalue pairs (Mercer) or Fourier features with constant coefficients (Bochner).
In some cases (e.g., [1] and [2]) the left-hand side (that is, the covariance function \(k(\cdot, \cdot)\)) is unknown and the kernel can only be approximated using its feature decomposition. In other cases (e.g., [3] and [4]), both the covariance function and feature decomposition are available in closed form.
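As a concrete illustration of the Bochner case (a sketch for intuition, not part of the API): for a stationary kernel with variance \(\sigma^2\), random Fourier features with constant coefficients give
\[\phi_i(x) = \sqrt{\tfrac{2\sigma^2}{L}} \cos(\omega_i^\top x + \tau_i), \qquad \lambda_i = 1, \qquad k(x, x') \approx \sum_{i=1}^{L} \lambda_i \phi_i(x) \phi_i(x'),\]
where \(\omega_i\) is drawn from the kernel's spectral density and \(\tau_i \sim \mathcal{U}[0, 2\pi]\).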
- Parameters:
  - kernel – The kernel corresponding to the feature decomposition. If None, there is no analytical expression associated with the infinite sum and we approximate the kernel based on the feature decomposition.

    Note

    In certain cases, the analytical expression for the kernel is not available. In this case, passing None is allowed, and K() and K_diag() will be computed using the approximation provided by the feature decomposition.

  - feature_functions – A Keras layer whose call evaluates the L features of the kernel, \(\phi_i(\cdot)\). For X with the shape [N, D], feature_functions(X) returns a tensor with the shape [N, L].

  - feature_coefficients – A tensor with the shape [L, 1] containing the coefficients associated with the features, \(\lambda_i\).
- property feature_functions: gpflow.keras.tf_keras.layers.Layer#
Return the kernel’s features \(\phi_i(\cdot)\).
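Below is a minimal usage sketch, assuming the RandomFourierFeatures layer shipped with gpflux (its exact import path may differ between gpflux versions) and an arbitrary choice of L = 1000 features; it builds a Bochner-style decomposition of a Matérn-5/2 kernel with constant coefficients and wraps it in KernelWithFeatureDecomposition.

```python
# A minimal sketch: wrap a Matern-5/2 kernel together with a random Fourier
# feature decomposition. The RandomFourierFeatures import path and the choice
# of 1000 features are assumptions, not prescribed by the API.
import numpy as np
import gpflow
from gpflow.config import default_float

from gpflux.layers.basis_functions.fourier_features import RandomFourierFeatures
from gpflux.sampling import KernelWithFeatureDecomposition

kernel = gpflow.kernels.Matern52()
num_features = 1000  # L, the number of terms in the finite feature decomposition

# phi_i(.): a Keras layer mapping [N, D] inputs to [N, L] feature evaluations
feature_functions = RandomFourierFeatures(kernel, num_features, dtype=default_float())
# lambda_i: constant coefficients for the Fourier (Bochner) case, shape [L, 1]
feature_coefficients = np.ones((num_features, 1), dtype=default_float())

kernel_with_features = KernelWithFeatureDecomposition(
    kernel, feature_functions, feature_coefficients
)

X = np.random.randn(7, 2)                    # N = 7 inputs with D = 2
print(kernel_with_features.K(X).shape)       # (7, 7): full covariance from K()
print(kernel_with_features.K_diag(X).shape)  # (7,): diagonal from K_diag()
print(feature_functions(X).shape)            # (7, 1000): the features phi_i(X)
```

Passing None as the first argument instead would make K() and K_diag() fall back to the finite feature approximation described above.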