
Using RBFKernelGrad for 2D output

meakbiyik opened this issue · 4 comments

Hi, I figured that Discussions don't get much attention here, and since there are already similar issues (#1388, #1353), I decided to open another one.

Discussed in https://github.com/cornellius-gp/gpytorch/discussions/2085

Originally posted by meakbiyik on July 31, 2022:

Hi all!

I am trying to do regression from a 1D input to a 2D output with a GP model. I also happen to have gradient information for the output dimensions, so x is 1D (call it t, as in time) while y is 4D, with dimensions a, b, da/dt, db/dt.

I want to model this as a multitask GP with gradient information (so the output dimensions and their gradients are modeled jointly), but I am not sure whether this is even possible with GPyTorch. Here's my unsuccessful attempt:

import gpytorch

class MultiTaskGPModelWithDerivatives(gpytorch.models.ExactGP):
    def __init__(self, train_x, train_y, likelihood):
        super(MultiTaskGPModelWithDerivatives, self).__init__(train_x, train_y, likelihood)
        # One constant mean (with derivative) per task
        self.mean_module = gpytorch.means.MultitaskMean(
            gpytorch.means.ConstantMeanGrad(), num_tasks=2
        )
        # Derivative-aware RBF kernel wrapped in a multitask (ICM) kernel
        self.covar_module = gpytorch.kernels.MultitaskKernel(
            gpytorch.kernels.RBFKernelGrad(), num_tasks=2, rank=2
        )

    def forward(self, x):
        mean_x = self.mean_module(x)
        covar_x = self.covar_module(x)
        return gpytorch.distributions.MultitaskMultivariateNormal(mean_x, covar_x)

with the likelihood

likelihood = gpytorch.likelihoods.MultitaskGaussianLikelihood(num_tasks=4)
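
For reference, RBFKernelGrad on its own already models function values and derivatives jointly, which is part of why the shapes get confusing. A quick check of the shapes involved (assuming 4 points of 1D input):

import torch
import gpytorch

k = gpytorch.kernels.RBFKernelGrad()
x = torch.randn(4, 1)  # n=4 points, d=1 input dimension
# RBFKernelGrad returns an (n*(d+1)) x (n*(d+1)) covariance:
# one row/column per function value plus one per partial derivative
print(k(x).shape)  # torch.Size([8, 8])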

Any help is appreciated!

meakbiyik · Jul 31, 2022

I don't know whether anyone has tried multitask + derivatives before. In theory it should be possible, but it might require some internal changes.

cc/ @dme65

gpleiss · Aug 07, 2022

@gpleiss thanks a lot! Here's the closest I got (though I'm really not knowledgeable about multi-output GPs).

import gpytorch

class MultiTaskGPModelWithDerivatives(gpytorch.models.ExactGP):
    def __init__(self, train_x, train_y, likelihood):
        super(MultiTaskGPModelWithDerivatives, self).__init__(
            train_x, train_y, likelihood
        )
        self.mean_module = gpytorch.means.MultitaskMean(
            gpytorch.means.ConstantMeanGrad(), num_tasks=2
        )
        # One derivative-aware RBF kernel per (value, derivative) pair,
        # combined in a linear model of coregionalization
        self.covar_module = gpytorch.kernels.LCMKernel(
            [
                gpytorch.kernels.RBFKernelGrad(active_dims=[0, 1]),
                gpytorch.kernels.RBFKernelGrad(active_dims=[2, 3]),
            ],
            num_tasks=4,
            rank=1,
        )

    def forward(self, x):
        mean_x = self.mean_module(x)
        covar_x = self.covar_module(x)
        return gpytorch.distributions.MultitaskMultivariateNormal(
            mean_x.view(4, 4), covar_x,  # reshape stacked means to (n, num_tasks)
        )

likelihood = gpytorch.likelihoods.MultitaskGaussianLikelihood(num_tasks=4)

This gives the following error for some fake input data with 4 samples:

RuntimeError: The expected shape of the kernel was torch.Size([16, 16]), but got torch.Size([32, 32]). This is likely a bug in GPyTorch.
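
If I'm reading the shapes right, the mismatch might come from RBFKernelGrad already doubling the per-point block (value + derivative) before the task Kronecker product is applied. My guess at the arithmetic, not verified:

n, d, num_tasks = 4, 1, 4
per_point = d + 1                 # RBFKernelGrad: one value + one derivative per point
print(n * per_point * num_tasks)  # 32: what the multitask kernel seems to produce
print(n * num_tasks)              # 16: what MultitaskMultivariateNormal expects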

meakbiyik · Aug 07, 2022

Hi,

Any success? Do you know of any examples of regression with multi-output GPs and their derivatives in GPyTorch?

manish-pra · Feb 25, 2024

No success, sadly. I just fit two separate models, one for (a, da/dt) and one for (b, db/dt), roughly as sketched below.
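
A minimal sketch of one such per-output model, following the pattern of GPyTorch's GP-regression-with-derivatives tutorial (not tested against my exact data):

import torch
import gpytorch

class GPWithDerivative(gpytorch.models.ExactGP):
    """One output plus its time derivative, e.g. train_y columns (a, da/dt)."""

    def __init__(self, train_x, train_y, likelihood):
        super().__init__(train_x, train_y, likelihood)
        self.mean_module = gpytorch.means.ConstantMeanGrad()
        self.covar_module = gpytorch.kernels.ScaleKernel(
            gpytorch.kernels.RBFKernelGrad()
        )

    def forward(self, x):
        mean_x = self.mean_module(x)    # shape (n, d+1): value and derivative means
        covar_x = self.covar_module(x)  # shape (n*(d+1), n*(d+1))
        return gpytorch.distributions.MultitaskMultivariateNormal(mean_x, covar_x)

# The "tasks" here are the value and its derivative, not a and b;
# a second, independent model handles (b, db/dt)
likelihood = gpytorch.likelihoods.MultitaskGaussianLikelihood(num_tasks=2)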

meakbiyik · Feb 25, 2024