DropoutUncertaintyExps

Experiments used in "Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning"

8 DropoutUncertaintyExps issues

I just pulled this repo and needed to make the following changes to be able to run `experiment.py` using tensorflow==2.8.0, keras==2.8.0, and scipy==1.8.0. Figured I would offer a merge but...

Hello Yarin, I tried running your code but found a small discrepancy in the results. For instance, I took Boston Housing and ran the same...

Hello Yarin, is there any way to interpret the obtained predictive uncertainty (variance)? After computing the predictive variance, i.e. the sample variance of T stochastic forward passes, is there any way...
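The sample variance the issue refers to is computed over T stochastic forward passes with dropout left on. A minimal sketch of that computation, where `stochastic_predict` is a hypothetical callable standing in for one dropout sample of the network's output:

```python
import numpy as np

def mc_dropout_stats(stochastic_predict, x, T=100):
    """Run T stochastic forward passes (dropout active) and summarize them.
    `stochastic_predict` is a hypothetical callable returning one dropout
    sample of the network output for input x."""
    samples = np.stack([stochastic_predict(x) for _ in range(T)])  # (T, ...)
    mean = samples.mean(axis=0)  # predictive mean
    var = samples.var(axis=0)    # sample variance across the T passes
    return mean, var

# Toy usage: a stand-in "network" whose output is noisy, mimicking dropout.
rng = np.random.default_rng(0)
noisy_net = lambda x: x * 2.0 + rng.normal(scale=0.1)
mean, var = mc_dropout_stats(noisy_net, np.array([1.0, 2.0]), T=500)
```

Per the paper, this sample variance captures only the model's uncertainty; the total predictive variance adds the observation-noise term 1/tau on top of it.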

https://github.com/yaringal/DropoutUncertaintyExps/blob/6eb4497628d12b0f300f4b4f6bdc386bebad565c/net/net.py#L9 should be changed to:

```python
from scipy.special import logsumexp
```

reference: https://github.com/cvxgrp/cvxpy/issues/640
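SciPy moved `logsumexp` from `scipy.misc` to `scipy.special` and removed the old location, which is what breaks the original import. A hedged shim that works across SciPy versions:

```python
try:
    # Current location (SciPy >= 0.19; the only location in SciPy >= 1.0)
    from scipy.special import logsumexp
except ImportError:
    # Fallback for very old SciPy versions
    from scipy.misc import logsumexp
```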

Hello Yarin, it looks like the description of the outputs in the `predict` method of the `net` class does not match the actual output. https://github.com/yaringal/DropoutUncertaintyExps/blob/6eb4497628d12b0f300f4b4f6bdc386bebad565c/net/net.py#L95-L108 According to your...

When normalizing the output, shouldn't the pre-defined model precision also be normalized?
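The scaling the question hints at follows from basic variance arithmetic: if targets are standardized as y_norm = (y - mu) / sigma, then Var[y_norm] = Var[y] / sigma^2, so a precision tau defined on the original scale corresponds to tau * sigma^2 on the normalized scale. A toy numerical check of that relationship (not code from the repo):

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.normal(loc=50.0, scale=3.0, size=100_000)  # toy targets

mu, sigma = y.mean(), y.std()
y_norm = (y - mu) / sigma

# Toy precisions: reciprocal variances on each scale
tau = 1.0 / y.var()            # precision on the original scale
tau_norm = 1.0 / y_norm.var()  # precision on the standardized scale

# Precision scales by sigma**2 under standardization
assert np.isclose(tau_norm, tau * sigma**2)
```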

I don't quite understand the calculation of the log-likelihood:

```python
# We compute the test log-likelihood
ll = (logsumexp(-0.5 * self.tau * (y_test[None] - Yt_hat)**2., 0) - np.log(T) - 0.5*np.log(2*np.pi)...
```

Any suggestions on how to implement the stochastic predictor with a _different_ dropout rate from the one used in training? I have tried to modify the layer attributes (`.rate`),...
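Mutating `.rate` on an existing layer often has no effect because the value may already be baked into a traced graph. One reliable alternative (a sketch, not the repo's approach) is to rebuild the model with `tf.keras.models.clone_model`, swapping the rate in each Dropout layer's config, then run stochastic passes with `training=True`:

```python
import tensorflow as tf

def with_dropout_rate(model, new_rate):
    """Clone `model`, replacing every Dropout layer's rate with `new_rate`.
    clone_model rebuilds the graph from fresh layer configs, so editing the
    config is reliable, unlike mutating .rate on an already-traced layer."""
    def clone_fn(layer):
        if isinstance(layer, tf.keras.layers.Dropout):
            cfg = layer.get_config()
            cfg["rate"] = new_rate
            return tf.keras.layers.Dropout.from_config(cfg)
        # Default cloning behavior for all other layers
        return layer.__class__.from_config(layer.get_config())
    new_model = tf.keras.models.clone_model(model, clone_function=clone_fn)
    new_model.set_weights(model.get_weights())  # Dropout has no weights
    return new_model

# Stochastic forward passes then keep dropout active via training=True, e.g.:
# samples = np.stack([new_model(x, training=True).numpy() for _ in range(T)])
```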