Will Tebbutt

602 comments by Will Tebbutt

I think I would like to see an example of how using this would look with / without this requirement, i.e. what would we currently do vs. what would we...

> One possible issue with your API is that it will be incompatible with likelihoods requiring more than one GP, e.g. Softmax, heteroscedastic, etc. From my experience it's better to...

> I am not sure. What I get via the augmentation is the analytical (stochastic if needed) gradient given the variational parameters, and in the non-stochastic case this translates into...

> That was not my main point though. It was more about the other way around. For example, for heteroscedastic regression you will want to have 2 GPs (correlated or...
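
To make the multi-GP point concrete, here is a rough sketch of the heteroscedastic case: one latent process models the mean and a second models the log observation-noise, so a likelihood that wraps a single GP cannot express it. The `heteroscedastic_lik` helper is hypothetical, purely to illustrate the shape of the problem.

```julia
using AbstractGPs, Distributions

# Two independent latent processes (they could also be correlated).
f_mean = GP(SEKernel())         # latent process for the mean
f_noise = GP(Matern32Kernel())  # latent process for the log noise scale

# y | f_mean(x), f_noise(x) ~ Normal(f_mean(x), exp(f_noise(x)))
heteroscedastic_lik(fm, fn) = Normal(fm, exp(fn))

x = rand(10)
fm = rand(f_mean(x, 1e-6))   # jittered sample of the mean process
fn = rand(f_noise(x, 1e-6))  # jittered sample of the log-noise process
y = rand.(heteroscedastic_lik.(fm, fn))  # input-dependent noise
```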

> Do you think it's a good idea to start off by making this compatible with elliptical slice sampling as @devmotion suggested? I'm totally on board with this, and it's...
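
For anyone unfamiliar: elliptical slice sampling only needs draws from the Gaussian prior and pointwise log-likelihood evaluations, which is why it composes so cleanly with a `LatentGP`-style design. Below is a minimal sketch of a single ESS transition (Murray, Adams & MacKay, 2010); the names `ess_step` and `loglik` are illustrative, not part of any package.

```julia
using Random

# One elliptical slice sampling transition. `f` is the current latent sample,
# `loglik` maps a latent vector to its log-likelihood, and `L` is the (lower)
# Cholesky factor of the prior covariance, so the prior is N(0, L * L').
function ess_step(rng, f, loglik, L)
    ν = L * randn(rng, length(f))      # auxiliary draw from the prior
    logy = loglik(f) + log(rand(rng))  # slice height
    θ = 2π * rand(rng)                 # initial angle on the ellipse
    θ_min, θ_max = θ - 2π, θ
    while true
        f′ = f .* cos(θ) .+ ν .* sin(θ)    # candidate point on the ellipse
        loglik(f′) > logy && return f′     # accept if above the slice
        θ < 0 ? (θ_min = θ) : (θ_max = θ)  # otherwise shrink the bracket
        θ = θ_min + (θ_max - θ_min) * rand(rng)
    end
end
```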

@sharanry's initial attempt at the above in #3 is great, but it and a comment from him on Slack have got me wondering about what I proposed above, in...

It became apparent when discussing approximate inference with pseudo-points that the above design can be a little annoying. See [here](https://github.com/JuliaGaussianProcesses/LatentGPs.jl/issues/4#issuecomment-636327748) for details. @sharanry what are your thoughts on this? I...

Yeah, sparse approximations are definitely something that we should include here. It's worth noting that we've already got some infrastructure for this in [AbstractGPs](https://github.com/JuliaGaussianProcesses/AbstractGPs.jl/blob/master/src/posterior_gp/approx_posterior_gp.jl). It might be worth considering how...

With our current API, the following is possible:

```julia
# Set up prior and initialise approximate posterior.
f = GP(k)
f_approx_post = ApproxPosteriorGP(f, some_parameters_to_be_optimised)
lf_approx_post = LatentGP(f_approx_post(x), BernoulliLikelihood())
# Maybe...
```
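
If the design settles along these lines, downstream usage might look something like the sketch below. This is purely hypothetical: the `rand` and `logpdf` overloads are assumptions about an eventual API, not existing functionality.

```julia
# Hypothetical continuation of the snippet above: draw latent function values
# and observations jointly, then evaluate their joint log density.
f_latent, y = rand(lf_approx_post)              # assumed to return (f=..., y=...)
lp = logpdf(lf_approx_post, (f=f_latent, y=y))  # assumed joint log density overload
```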