David Widmann
Yes, I think that's one of the things that could/should be moved from DynamicPPL/Turing to AbstractMCMC (maybe I misremember but it feels like there was also some discussion in some...
Maybe one should just use a special set of weighted observations, i.e., `condition(model, observations)` where the weights are included in `observations`, to keep the API consistent. Related: https://github.com/TuringLang/DynamicPPL.jl/issues/208
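For context, a sketch of what the existing `condition` API looks like and where weighted observations might slot in. The `Weighted` wrapper below is purely hypothetical, only an illustration of the idea, not an existing type:

```julia
using DynamicPPL, Distributions

@model function demo()
    x ~ Normal()
    y ~ Normal(x, 1)
end

# Current API: fix `y` to an observed value.
conditioned = condition(demo(), (y = 0.5,))

# Hypothetical extension from the comment above: bundle weights with the
# observations so the API stays consistent, e.g.
#     condition(model, (y = Weighted(0.5, 2.0),))
# where `Weighted(value, weight)` does not exist and is made up here.
```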
How do you run `sample` and what exactly do you want to do? If you want to perform Bayesian inference of `parameters` you would have to specify a prior for...
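For illustration, a minimal Turing model in which every unknown parameter has an explicit prior; the model and variable names are made up for the example:

```julia
using Turing

@model function demo(y)
    # Priors for the unknown parameters (choices here are arbitrary).
    μ ~ Normal(0, 1)
    σ ~ truncated(Normal(0, 1); lower=0)
    # Likelihood of the observations.
    y .~ Normal(μ, σ)
end

# Bayesian inference of μ and σ given some data.
chain = sample(demo(randn(10)), NUTS(), 100)
```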
> But clearly I'm not trying to infer anything about obs, it's just my storage variable for my data, which I'm getting from findobsvar.

Well, it's not clear from the...
Can you explain what errors you have in mind? Would you like Turing to catch any exceptions when calling the model? Or are you thinking about specific numerical issues? I...
Was fixed at some point, it seems.
Maybe I missed something (haven't checked this PR for a while) but I think @torfjelde's and my concerns above are still valid?
> Regarding bors: but is there a way to reset it?

A bit late, but one can check the dashboard here if it is unclear if/what the problem is: https://app.bors.tech/repositories/24589...
The main issue is that, as mentioned above, the cross-validation example currently does not work: https://github.com/TuringLang/DynamicPPL.jl/actions/runs/3726471522/jobs/6319942583#step:5:248 Some other examples fail as well (e.g., https://github.com/TuringLang/DynamicPPL.jl/actions/runs/3726471522/jobs/6319942583#step:5:231), but those can be fixed easily by...
I would suggest the following:
- Change the model to a Gaussian model with normal-inverse gamma prior
- Write a function `bayes_loss(dataset)`
- Use samples from the exact posterior instead...
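A hedged sketch of the suggestion above, assuming the standard Normal-inverse-gamma conjugacy for a Gaussian with unknown mean and variance; the hyperparameters and the loss definition (average negative log predictive density) are assumptions for illustration, and `bayes_loss` is only a hypothetical signature:

```julia
using Distributions, Statistics

# Conjugate posterior update for a Normal-inverse-gamma prior
# NIG(μ0, λ0, α0, β0) on (μ, σ²); hyperparameter defaults are arbitrary.
function posterior_params(data; μ0=0.0, λ0=1.0, α0=2.0, β0=1.0)
    n = length(data)
    x̄ = mean(data)
    λn = λ0 + n
    μn = (λ0 * μ0 + n * x̄) / λn
    αn = α0 + n / 2
    βn = β0 + sum(abs2, data .- x̄) / 2 + n * λ0 * (x̄ - μ0)^2 / (2 * λn)
    return μn, λn, αn, βn
end

# Draw (μ, σ²) pairs from the exact posterior instead of running MCMC.
function sample_posterior(data, nsamples)
    μn, λn, αn, βn = posterior_params(data)
    return map(1:nsamples) do _
        σ² = rand(InverseGamma(αn, βn))
        μ = rand(Normal(μn, sqrt(σ² / λn)))
        (μ, σ²)
    end
end

# Hypothetical `bayes_loss`: average negative log predictive density of
# the dataset under the exact posterior.
function bayes_loss(dataset; nsamples=1_000)
    samples = sample_posterior(dataset, nsamples)
    return -mean(logpdf(Normal(μ, sqrt(σ²)), x)
                 for (μ, σ²) in samples, x in dataset)
end
```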