[Bug] Error in sampling from LKJCovariancePrior
🐛 Bug
Error in sampling from LKJCovariancePrior
To reproduce
**Code snippet to reproduce**
import torch
import gpytorch

n = 3
sd_prior = gpytorch.priors.GammaPrior(1.0, 1.0)
prior = gpytorch.priors.LKJCovariancePrior(n=n, eta=1.0, sd_prior=sd_prior)
samples = prior.sample(torch.Size([5]))
**Stack trace/error message**
Traceback (most recent call last):
File "test_script.py", line 12, in <module>
samples = prior.sample(torch.Size([5]))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File ".../lib/python3.11/site-packages/gpytorch/priors/lkj_prior.py", line 115, in sample
return marginal_sds.matmul(base_correlation).matmul(marginal_sds)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
RuntimeError: Expected size for first two dimensions of batch2 tensor to be: [5, 15] but got: [5, 3].
Expected Behavior
I'd expect this to sample a (5,3,3) tensor of five 3x3 covariance matrices.
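For comparison, sampling correlation matrices directly with `torch.distributions.LKJCholesky` (plain PyTorch, not gpytorch) produces the batch shape I'd expect here; this sketch just illustrates the `(5, 3, 3)` shape:

```python
import torch

# LKJCholesky (concentration eta=1.0) samples lower-triangular Cholesky
# factors of 3x3 correlation matrices with the requested sample shape
lkj = torch.distributions.LKJCholesky(3, concentration=1.0)
chol = lkj.sample(torch.Size([5]))    # shape (5, 3, 3)
corr = chol @ chol.transpose(-1, -2)  # recover full correlation matrices
print(corr.shape)  # torch.Size([5, 3, 3])
```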
System information
- GPyTorch version: 1.14
- PyTorch version: 2.4.1
- OS: macOS Tahoe 26.1
Additional context
Something like
if marginal_sds.shape == sample_shape:
    # ensure marginal_sds has a feature dimension for broadcasting
    marginal_sds = marginal_sds.unsqueeze(-1)
    marginal_sds = marginal_sds.expand(*sample_shape, self.correlation_prior.n)
seems to work, but causes downstream tests to fail.
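To illustrate the shape problem outside of gpytorch, here is a sketch with plain torch stand-ins (`base_correlation` and the `D @ C @ D` product are my approximation of what `lkj_prior.py` computes, not its actual implementation):

```python
import torch

sample_shape, n = torch.Size([5]), 3

# stand-in for the batch of correlation matrices sampled by the LKJ prior
base_correlation = torch.eye(n).expand(*sample_shape, n, n)

# Gamma.sample(sample_shape) yields shape (5,): one scalar per sample,
# with no feature dimension of size n to scale each row/column
sds = torch.distributions.Gamma(1.0, 1.0).sample(sample_shape)
print(sds.shape)  # torch.Size([5])

# the workaround above: add a feature dimension, expand to n, then build
# diagonal scale matrices D and form the covariance D @ C @ D
sds = sds.unsqueeze(-1).expand(*sample_shape, n)
marginal_sds = torch.diag_embed(sds)
cov = marginal_sds.matmul(base_correlation).matmul(marginal_sds)
print(cov.shape)  # torch.Size([5, 3, 3])
```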
@lpapenme thanks for reporting the bug!
It appears that LKJCovariancePrior works fine with SmoothedBoxPrior, but not with GammaPrior:
import torch
import gpytorch

# sd_prior = gpytorch.priors.GammaPrior(1.0, 1.0)  # bug
sd_prior = gpytorch.priors.SmoothedBoxPrior(0.1, 2.0)  # works fine
prior = gpytorch.priors.LKJCovariancePrior(n=3, eta=1.0, sd_prior=sd_prior)
samples = prior.sample(torch.Size([5]))
The root cause is that GPyTorch's priors return inconsistent sample shapes, even though both GammaPrior and SmoothedBoxPrior are univariate priors:
import torch
import gpytorch

gamma_prior = gpytorch.priors.GammaPrior(1.0, 1.0)
samples = gamma_prior.sample(torch.Size([5]))
print(samples.shape) # torch.Size([5])
smoothed_box_prior = gpytorch.priors.SmoothedBoxPrior(0.1, 2.0)
samples = smoothed_box_prior.sample(torch.Size([5]))
print(samples.shape) # torch.Size([5, 1])
It looks like SmoothedBoxPrior is the only prior that outputs a different sample shape, which makes me think there is a separate bug in SmoothedBoxPrior (distinct from this issue) that we need to fix as well.
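My guess (not verified against the SmoothedBoxPrior source) is that the trailing dimension comes from the bounds being stored as 1-d tensors, which gives the underlying distribution a non-empty `batch_shape`. Plain `torch.distributions` shows the effect:

```python
import torch

# scalar parameters -> empty batch_shape -> samples of shape (5,)
s_scalar = torch.distributions.Uniform(0.1, 2.0).sample(torch.Size([5]))
print(s_scalar.shape)  # torch.Size([5])

# 1-d tensor parameters -> batch_shape (1,) -> samples of shape (5, 1)
s_tensor = torch.distributions.Uniform(
    torch.tensor([0.1]), torch.tensor([2.0])
).sample(torch.Size([5]))
print(s_tensor.shape)  # torch.Size([5, 1])
```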
The unit tests didn't catch the bug in this issue, because we've been using SmoothedBoxPrior in the test cases of LKJCovariancePrior.
import torch
from gpytorch.priors import (
GammaPrior,
HalfCauchyPrior,
HalfNormalPrior,
LogNormalPrior,
NormalPrior,
SmoothedBoxPrior,
UniformPrior,
)
for prior in [
NormalPrior(1.0, 1.0),
GammaPrior(1.0, 1.0),
HalfCauchyPrior(1.0),
HalfNormalPrior(1.0),
LogNormalPrior(1.0, 1.0),
UniformPrior(1.0, 2.0),
SmoothedBoxPrior(0.1, 2.0),
]:
samples = prior.sample(torch.Size([5]))
print(samples.shape)
# outputs:
# torch.Size([5])
# torch.Size([5])
# torch.Size([5])
# torch.Size([5])
# torch.Size([5])
# torch.Size([5])
# torch.Size([5, 1])
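For a regression test, this invariant could be written as an assertion. This sketch uses the underlying torch distributions rather than the gpytorch prior wrappers, so it only demonstrates the expected invariant, not the fix:

```python
import torch

# every univariate prior's sample should have shape == sample_shape,
# with no trailing singleton dimension
sample_shape = torch.Size([5])
for dist in [
    torch.distributions.Normal(1.0, 1.0),
    torch.distributions.Gamma(1.0, 1.0),
    torch.distributions.HalfCauchy(1.0),
    torch.distributions.HalfNormal(1.0),
    torch.distributions.LogNormal(1.0, 1.0),
    torch.distributions.Uniform(1.0, 2.0),
]:
    assert dist.sample(sample_shape).shape == sample_shape, type(dist).__name__
print("all univariate distributions sample with shape", tuple(sample_shape))
```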
cc @jacobrgardner @gpleiss @Balandat
#2686 would fix the bug. Feel free to make comments in the PR.
FWIW #1317 is an ancient issue that has some discussion on this :(. There is also a comment there specifically on the SmoothedBoxPrior issue.