Bayesian-Neural-Networks

Question: log_gaussian_loss function used in MC Dropout and SGLD

Open acse-aol21 opened this issue 3 years ago • 1 comment

Firstly, thank you for all these great notebooks, they've been very helpful in building a better understanding of these methods.

I am wondering where the log_gaussian_loss function originates. I'm struggling to find a reference to it in the literature, though I'm very likely looking in the wrong places.

In the MC dropout notebook, one output neuron seems to give the prediction while another feeds into this loss function. I'm struggling to get a simpler version working with a single output neuron and a different loss function. Where does this technique originate?

Thanks again

acse-aol21 avatar Jul 28 '22 13:07 acse-aol21

log_gaussian_loss is just the negative log of the Gaussian probability density, used as a minimisation objective. If you expand it, -log N(y | mu, sigma^2) = (y - mu)^2 / (2 sigma^2) + log(sigma) + 0.5 log(2 pi), so it looks like a squared error scaled by the noise variance plus a regularisation term, which is just the log of the noise standard deviation. Most commonly, the likelihood variance is fixed to 1 and this regulariser drops out.
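For concreteness, here is a minimal PyTorch sketch of this kind of loss (the function name, arguments, and sigma parameterisation here are illustrative; check the repo's code for the exact form):

```python
import numpy as np
import torch

def log_gaussian_loss(output, target, sigma, no_dim):
    # Negative log Gaussian density, summed over the batch:
    #   -log N(y | mu, sigma^2) = (y - mu)^2 / (2 * sigma^2)
    #                             + log(sigma) + 0.5 * log(2 * pi)
    # i.e. a scaled squared error plus a log-standard-deviation
    # regulariser and an additive constant.
    exponent = -0.5 * (target - output) ** 2 / sigma ** 2
    log_coeff = -no_dim * torch.log(sigma) - 0.5 * no_dim * np.log(2 * np.pi)
    return -(log_coeff + exponent).sum()
```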

This type of loss is very widely used in the literature. However, it is not often written out explicitly that the negative log-Gaussian is just a scaled squared error plus a log-standard-deviation term (up to a constant).
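And a toy usage sketch tying this to the two-output-neuron setup from the question, using the log_gaussian_loss sketch above (the linear net, shapes, and exp parameterisation here are hypothetical):

```python
import torch

torch.manual_seed(0)
x, y = torch.randn(8, 4), torch.randn(8, 1)
net = torch.nn.Linear(4, 2)  # stand-in for the real network: 2 outputs per input

out = net(x)
mu, log_sigma = out[:, :1], out[:, 1:]
sigma = torch.exp(log_sigma)  # second head parameterises the noise scale
loss = log_gaussian_loss(mu, y, sigma, no_dim=1)

# Fixing sigma = 1 makes the regulariser vanish: the loss reduces to
# 0.5 * sum of squared errors plus a constant, i.e. ordinary MSE training
# with a single output neuron.
fixed_loss = log_gaussian_loss(mu, y, torch.ones_like(mu), no_dim=1)
```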

Hope this clarifies things, Javier

JavierAntoran avatar Aug 16 '22 16:08 JavierAntoran