Wesley Maddox
The included `10` here should be the number of quadrature samples used to evaluate `E_{q(f)}[\log p(y | f)]`. I believe these samples should automatically go away when you evaluate the likelihood.
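For concreteness, here is a minimal sketch of where that dimension comes from and why it disappears once the likelihood integrates it out. The toy model, data, and the choice of 10 quadrature sites are all illustrative, not from your code:

```python
import torch
import gpytorch

# Toy classification data (illustrative only).
train_x = torch.linspace(0, 1, 20).unsqueeze(-1)
train_y = (train_x.squeeze(-1) > 0.5).float()

class ToyVariationalGP(gpytorch.models.ApproximateGP):
    def __init__(self, inducing_points):
        variational_dist = gpytorch.variational.CholeskyVariationalDistribution(inducing_points.size(0))
        variational_strategy = gpytorch.variational.VariationalStrategy(
            self, inducing_points, variational_dist, learn_inducing_locations=True
        )
        super().__init__(variational_strategy)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())

    def forward(self, x):
        return gpytorch.distributions.MultivariateNormal(self.mean_module(x), self.covar_module(x))

model = ToyVariationalGP(train_x[:5])

# The "10" is the number of Gauss-Hermite quadrature sites used to approximate
# E_{q(f)}[log p(y | f)]; construct the likelihood under the setting so its
# quadrature grid picks up that value.
with gpytorch.settings.num_gauss_hermite_locs(10):
    likelihood = gpytorch.likelihoods.BernoulliLikelihood()
    q_f = model(train_x)                               # q(f), a MultivariateNormal
    ell = likelihood.expected_log_prob(train_y, q_f)   # quadrature dim is integrated out
    print(ell.shape)                                   # torch.Size([20]) -- no extra 10
```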
What exactly do you mean? Do you mean that you only have access to `Cov(X_train, X_train)` or that the input (e.g. `X_train`) itself is a covariance matrix?
How do you intend to make predictions then? Do you also have access to `Cov(X_train, X_test)`?
I'm pretty sure that's close to the correct number but off by a scaling factor equal to the size of the dataset; see https://github.com/wjmaddox/swa_gaussian/blob/61a918fa1c3e732dd19fc278ad3a5969f24d5a72/experiments/uncertainty/uncertainty.py#L214, where the scaling factor is 10k, which produces a...
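As a quick, purely illustrative check of that factor (the numbers below are made up; only the 10k dataset size comes from the linked script):

```python
# A per-example average log-likelihood and the corresponding dataset-level total
# differ by a factor of N, the dataset size (10k in the linked experiment).
num_data = 10_000
per_example_ll = -0.35                     # made-up per-example value
total_ll = per_example_ll * num_data       # "off by a scaling factor equal to the size of the dataset"
print(per_example_ll, total_ll)
```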
In general, that's going to break the online nature of the algorithm, so that's most likely why it's throwing errors. I suppose one could do a meta-learning-like strategy if...
That's correct. However, if you fix the W matrix, you can still move the inputs around to some extent.
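A minimal sketch of what I mean, assuming `W` is the fixed projection and the inputs are the thing you are free to optimize (the names, shapes, and squared-error objective here are just illustrative):

```python
import torch

# Hypothetical fixed projection matrix W: it is never registered with the
# optimizer and never receives gradient updates.
W = torch.randn(8, 2)

# The inputs, by contrast, stay free parameters that we can move around.
inputs = torch.randn(16, 8, requires_grad=True)
target = torch.zeros(16, 2)

opt = torch.optim.Adam([inputs], lr=0.01)   # optimize only the inputs, not W
for _ in range(100):
    opt.zero_grad()
    loss = ((inputs @ W - target) ** 2).mean()
    loss.backward()
    opt.step()
```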
Ugh, yes, that looks like some sort of improper-dimension error. I'll try to take a closer look this weekend. Does it work if you enforce the dimension?
Ugh, yes, not surprising that it no longer works, especially as the core contribution here (fantasization) is merged into gpytorch for Gaussian likelihoods, but not botorch (one day I'll publish...
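For reference, the Gaussian-likelihood path that does exist in gpytorch looks roughly like the sketch below; the toy model and data are illustrative, and only `get_fantasy_model` itself is the relevant API:

```python
import torch
import gpytorch

class ExactGPModel(gpytorch.models.ExactGP):
    def __init__(self, train_x, train_y, likelihood):
        super().__init__(train_x, train_y, likelihood)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())

    def forward(self, x):
        return gpytorch.distributions.MultivariateNormal(self.mean_module(x), self.covar_module(x))

# Toy regression data (illustrative only).
train_x = torch.linspace(0, 1, 10).unsqueeze(-1)
train_y = torch.sin(6 * train_x).squeeze(-1)

likelihood = gpytorch.likelihoods.GaussianLikelihood()
model = ExactGPModel(train_x, train_y, likelihood)
model.eval()
likelihood.eval()

# get_fantasy_model needs the prediction caches, so call the model once first.
with torch.no_grad():
    _ = model(torch.rand(5, 1))

# Condition on new ("fantasy") observations without refitting the hyperparameters;
# this is the Gaussian-likelihood fantasization path that lives in gpytorch.
new_x = torch.tensor([[0.25], [0.75]])
new_y = torch.sin(6 * new_x).squeeze(-1)
fantasy_model = model.get_fantasy_model(new_x, new_y)
```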
I pushed a fix and it ran locally for me. Sorry for the long delay there.