
support for non-stationary GP

Open torabin opened this issue 4 years ago • 4 comments

Hi GPyTorch developers,

I was wondering whether GPyTorch can be used with a custom kernel that introduces new hyperparameters? To be more specific, I want to extend a standard RBF kernel with a temporal kernel (to support a non-stationary latent function). My new kernel would have the form G^|t - t'| * K(x, x'), 0 < G <= 1, where G is a new hyperparameter and t, t' are the timestamps associated with x and x'. So, I would like to know whether it is possible to use GPyTorch for this case or not. Thanks in advance

torabin avatar Jan 21 '21 18:01 torabin

This should be pretty straightforward to do. You wouldn't necessarily define this as a new compound kernel; instead, define the one-dimensional G^|t - t'| kernel separately and then multiply the two kernels in the model's forward method. So, unlike what's done in https://github.com/cornellius-gp/gpytorch/blob/master/examples/01_Exact_GPs/Simple_GP_Regression.ipynb, you'd do something like

   def forward(self, x, t):
       mean = self.mean_module(x)
       # elementwise product of the spatial and temporal kernels
       covar = self.base_covar_module(x) * self.temporal_covar_module(t)
       return gpytorch.distributions.MultivariateNormal(mean, covar)

and you'll need to make sure that your train inputs are tuples of tensors (x, t). Here, temporal_covar_module would be the new kernel you define (this could be modeled on ScaleKernel, but with the multiplicative factor depending on the input t).

Balandat avatar Jan 21 '21 19:01 Balandat


Is there any example of temporal_covar_module (i.e. how to define a kernel dependent on t)? @torabin @Balandat

ginward avatar Jan 21 '21 21:01 ginward

@ginward look at these docs on implementing custom kernels: https://docs.gpytorch.ai/en/latest/examples/00_Basic_Usage/Implementing_a_custom_Kernel.html

gpleiss avatar Feb 10 '21 23:02 gpleiss


Did you end up doing this?

sanaamouzahir avatar Dec 12 '23 18:12 sanaamouzahir