Turing.jl
Using Bayesian inference with a noise-free likelihood
Hello,
I have a somewhat general question: is it still possible to perform Bayesian inference when the likelihood is essentially deterministic? In my setup, I have a prior over some parameters and an associated simulation framework which produces output data. This framework is noise-free and therefore deterministic. For given observations, I want to find the right parameters of my simulation. Is there a way to do this in the Bayesian way? My only idea would be to force the likelihood to be a distribution with some really small variance. Thanks for the help :)
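To make the small-variance idea concrete, here is a minimal Python sketch (not Turing.jl) with a hypothetical one-parameter simulator `f(theta) = theta^2 + 1` standing in for the real framework. Wrapping the deterministic output in a Gaussian with a small fixed sigma turns it into a proper (pseudo-)likelihood that a standard MCMC sampler can target:

```python
import math
import random

def f(theta):
    # Hypothetical deterministic simulator standing in for the real framework.
    return theta ** 2 + 1.0

def log_posterior(theta, y_obs, sigma=0.05):
    # Standard normal prior on theta, plus a Gaussian "pseudo-likelihood"
    # centred on the noise-free simulator output with a small fixed sigma.
    log_prior = -0.5 * theta ** 2
    log_lik = -0.5 * ((y_obs - f(theta)) / sigma) ** 2
    return log_prior + log_lik

def metropolis(y_obs, n=20000, step=0.2, seed=0):
    # Plain random-walk Metropolis on the pseudo-posterior.
    rng = random.Random(seed)
    theta, lp = 0.0, log_posterior(0.0, y_obs)
    samples = []
    for _ in range(n):
        prop = theta + rng.gauss(0.0, step)
        lp_prop = log_posterior(prop, y_obs)
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):
            theta, lp = prop, lp_prop
        samples.append(theta)
    return samples

samples = metropolis(y_obs=2.0)
# The posterior mass concentrates near theta = +1 or theta = -1,
# the solutions of theta^2 + 1 = 2 (the chain typically stays in one mode).
```

The smaller you make `sigma`, the more sharply the posterior concentrates on the solution set, but the harder it becomes for the sampler to move, so there is a practical trade-off.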
To me, the setup sounds like simulation-based inference. I'm not too familiar with the field (I only learnt some things recently), but there are definitely approaches for performing approximate Bayesian inference in this setting. Unfortunately, I'm not knowledgeable enough to give any recommendations regarding methods etc., but maybe the link above and the references mentioned therein are helpful.
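One of the simplest simulation-based approaches is rejection ABC (approximate Bayesian computation). A minimal Python sketch, again with a hypothetical scalar simulator: draw parameters from the prior, run the simulator, and keep a draw whenever its output lands within a tolerance eps of the observation:

```python
import random

def simulator(theta):
    # Hypothetical noise-free simulator; replace with the real framework.
    return theta ** 2 + 1.0

def rejection_abc(y_obs, eps=0.05, n_draws=200_000, seed=0):
    # Rejection ABC: accept prior draws whose simulated output is within
    # eps of the observation. As eps -> 0, the accepted draws approach
    # the exact (degenerate) posterior of a deterministic simulator.
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_draws):
        theta = rng.gauss(0.0, 1.0)  # draw from the N(0, 1) prior
        if abs(simulator(theta) - y_obs) < eps:
            accepted.append(theta)
    return accepted

post = rejection_abc(y_obs=2.0)
# Accepted draws cluster near theta = +1 and theta = -1,
# the solutions of theta^2 + 1 = 2.
```

Rejection ABC is wasteful when the prior is broad or the parameter is high-dimensional; the simulation-based inference literature covers much more efficient variants, but the accept/reject idea above is the core of it.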
Interesting, thank you very much for the link, I will try to have a look at it. Using the Dirac delta function as the likelihood would not work, right?
Using the Dirac delta function as the likelihood would not work, right?
Nah, it wouldn't. We have $p(\theta \mid x) \propto p(x \mid \theta) p(\theta)$, so if you make the likelihood a delta, $p(x \mid \theta) = \delta(x - f(\theta))$, it is zero for (almost) every $\theta$: the posterior is supported only on the set $\{\theta : f(\theta) = x\}$, which has zero prior probability for continuous parameters, so there is nothing a standard sampler can work with.
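To see the problem numerically, here is a small self-contained Python sketch with a hypothetical continuous simulator: treating the delta as an importance weight that is nonzero only on an exact match, no prior draw ever matches the observation, so every weight is zero:

```python
import random

def simulator(theta):
    # Hypothetical deterministic simulator.
    return theta ** 2 + 1.0

rng = random.Random(0)
y_obs = 2.0
# A Dirac-delta "likelihood" gives a prior draw nonzero weight only if
# its simulator output equals y_obs *exactly* -- a probability-zero event
# for a continuous parameter, so in practice every weight is zero.
weights = [1.0 if simulator(rng.gauss(0.0, 1.0)) == y_obs else 0.0
           for _ in range(1_000_000)]
print(sum(weights))  # 0.0 -- no draw matches exactly
```

This is exactly the degeneracy that tolerance-based methods such as ABC avoid, by relaxing "equals exactly" to "lies within eps".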
Is there a way to do this in the Bayesian way
Nah, I don't think so. As @devmotion said, you probably want to look at likelihood-free inference instead :+1:
This is technically possible to support in Turing too, but we haven't gotten around to it yet.