spateo.tdr.interpolations.interpolation_gaussianprocess
=======================================================

.. py:module:: spateo.tdr.interpolations.interpolation_gaussianprocess


Submodules
----------

.. toctree::
   :maxdepth: 1

   /autoapi/spateo/tdr/interpolations/interpolation_gaussianprocess/gp_models/index
   /autoapi/spateo/tdr/interpolations/interpolation_gaussianprocess/gp_train/index


Classes
-------

.. autoapisummary::

   spateo.tdr.interpolations.interpolation_gaussianprocess.Approx_GPModel
   spateo.tdr.interpolations.interpolation_gaussianprocess.Exact_GPModel


Functions
---------

.. autoapisummary::

   spateo.tdr.interpolations.interpolation_gaussianprocess.gp_train


Package Contents
----------------

.. py:class:: Approx_GPModel(inducing_points)

   Bases: :py:obj:`gpytorch.models.ApproximateGP`

   The base class for any Gaussian process latent function to be used in conjunction
   with approximate inference (typically stochastic variational inference).
   This base class can be used to implement most inducing point methods where the
   variational parameters are learned directly.

   :param ~gpytorch.variational._VariationalStrategy variational_strategy: The strategy that
       determines how the model marginalizes over the variational distribution (over inducing
       points) to produce the approximate posterior distribution (over data).

   The :meth:`forward` function should describe how to compute the prior latent distribution
   on a given input. Typically, this will involve a mean and kernel function.
   The result must be a :obj:`~gpytorch.distributions.MultivariateNormal`.

   .. rubric:: Example

   >>> class MyVariationalGP(gpytorch.models.PyroGP):
   >>>     def __init__(self, variational_strategy):
   >>>         super().__init__(variational_strategy)
   >>>         self.mean_module = gpytorch.means.ZeroMean()
   >>>         self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())
   >>>
   >>>     def forward(self, x):
   >>>         mean = self.mean_module(x)
   >>>         covar = self.covar_module(x)
   >>>         return gpytorch.distributions.MultivariateNormal(mean, covar)
   >>>
   >>> # variational_strategy = ...
   >>> model = MyVariationalGP(variational_strategy)
   >>> likelihood = gpytorch.likelihoods.GaussianLikelihood()
   >>>
   >>> # optimization loop for variational parameters...
   >>>
   >>> # test_x = ...;
   >>> model(test_x)  # Returns the approximate GP latent function at test_x
   >>> likelihood(model(test_x))  # Returns the (approximate) predictive posterior distribution at test_x

   .. py:attribute:: variational_distribution

   .. py:attribute:: variational_strategy

   .. py:attribute:: mean_module

   .. py:attribute:: covar_module

   .. py:method:: forward(x)


.. py:class:: Exact_GPModel(train_x, train_y, likelihood)

   Bases: :py:obj:`gpytorch.models.ExactGP`

   The base class for any Gaussian process latent function to be used in conjunction
   with exact inference.

   :param torch.Tensor train_inputs: (size n x d) The training features :math:`\mathbf X`.
   :param torch.Tensor train_targets: (size n) The training targets :math:`\mathbf y`.
   :param ~gpytorch.likelihoods.GaussianLikelihood likelihood: The Gaussian likelihood that
       defines the observational distribution. Since we're using exact inference, the
       likelihood must be Gaussian.

   The :meth:`forward` function should describe how to compute the prior latent distribution
   on a given input. Typically, this will involve a mean and kernel function.
   The result must be a :obj:`~gpytorch.distributions.MultivariateNormal`.

   Calling this model will return the posterior of the latent Gaussian process when conditioned
   on the training data. The output will be a :obj:`~gpytorch.distributions.MultivariateNormal`.

   .. rubric:: Example

   >>> class MyGP(gpytorch.models.ExactGP):
   >>>     def __init__(self, train_x, train_y, likelihood):
   >>>         super().__init__(train_x, train_y, likelihood)
   >>>         self.mean_module = gpytorch.means.ZeroMean()
   >>>         self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())
   >>>
   >>>     def forward(self, x):
   >>>         mean = self.mean_module(x)
   >>>         covar = self.covar_module(x)
   >>>         return gpytorch.distributions.MultivariateNormal(mean, covar)
   >>>
   >>> # train_x = ...; train_y = ...
   >>> likelihood = gpytorch.likelihoods.GaussianLikelihood()
   >>> model = MyGP(train_x, train_y, likelihood)
   >>>
   >>> # test_x = ...;
   >>> model(test_x)  # Returns the GP latent function at test_x
   >>> likelihood(model(test_x))  # Returns the (approximate) predictive posterior distribution at test_x

   .. py:attribute:: mean_module

   .. py:attribute:: covar_module

   .. py:method:: forward(x)


.. py:function:: gp_train(model, likelihood, train_loader, train_epochs, method, N, device)