
Building a Bayesian deep learning classifier

Results: 13 bayesian-neural-network-blogpost issues

Hi, I have created a Bayesian CNN classifier as described in this repo, but my model's loss is always negative, as is the logits_variance_loss (see screenshot below). Any idea...

I tried running the ./bayesian-neural-network-blogpost/bin/download_model_info.py script to download the model along with the CIFAR-10 data, but I was unable to access the S3 bucket. Could someone help...

Hi, I have been reading the document carefully, but I do not quite get how to at least get things to work.

The function definition is `tf.keras.backend.categorical_crossentropy(target, output, from_logits=False, axis=-1)`, but the code on line 68 is `undistorted_loss = K.categorical_crossentropy(pred, true, from_logits=True)`, so the arguments appear to be swapped.
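The argument order matters because cross-entropy is not symmetric in its two inputs. A minimal NumPy sketch (illustrative only, not the repo's Keras code) that mirrors the Keras argument order (`target` first, `output` logits second) shows that swapping them changes the result:

```python
import numpy as np

def categorical_crossentropy(target, output_logits):
    """Cross-entropy from logits, mirroring the Keras argument order
    (target first, output second). NumPy sketch, not the repo's code."""
    # softmax over the last axis, with the usual max-shift for stability
    z = output_logits - output_logits.max(axis=-1, keepdims=True)
    probs = np.exp(z) / np.exp(z).sum(axis=-1, keepdims=True)
    return -(target * np.log(probs)).sum(axis=-1)

true = np.array([[0.0, 1.0, 0.0]])   # one-hot label
pred = np.array([[2.0, 1.0, 0.1]])   # raw logits

correct = categorical_crossentropy(true, pred)   # target first
swapped = categorical_crossentropy(pred, true)   # arguments reversed

# The two calls give different losses, so passing (pred, true)
# instead of (true, pred) silently computes the wrong quantity.
```
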

![image](https://user-images.githubusercontent.com/28092058/55728002-d4be5f00-5a45-11e9-8426-40cd5d5e3d36.png) The model trains with both logits_variance and softmax as outputs. The softmax output is trained with the classification labels; what is logits_variance trained with?

Hi, when modeling aleatoric uncertainty, why does `pred_var`, the predicted logit values and variance, have shape (N, C + 1)? Shouldn't it be (N, C * 2)?
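As I understand the blog post's setup, the model predicts C class logits plus a single scalar variance per example, which gives (N, C + 1); a per-class variance design would instead give (N, 2C). A small sketch of splitting such an output (the array here is hypothetical, not the repo's actual tensor):

```python
import numpy as np

N, C = 4, 10
# Hypothetical model output: C logits plus one scalar variance per example
pred_var = np.random.randn(N, C + 1)

logits = pred_var[:, :C]     # shape (N, C): per-class logits
variance = pred_var[:, C:]   # shape (N, 1): a single variance per example
```
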

Hi, in _What Uncertainties Do We Need in Bayesian Deep Learning for Computer Vision?_, it seems that when modeling aleatoric uncertainty, dropout is disabled and the model is just a normal...

Hi, I have read your explanation of Bayesian neural networks, aleatoric uncertainty, and epistemic uncertainty. It is excellent and straightforward. But I also noticed an interesting phenomenon: when...

```python
model.compile(optimizer=optimizers.Adam(lr=1e-3, decay=0.001),
              loss={'logits_variance': bayesian_categorical_crossentropy(100, 10),
                    'softmax_output': 'categorical_crossentropy'},
              metrics={'softmax_output': metrics.categorical_accuracy},
              loss_weights={'logits_variance': .2, 'softmax_output': 1.})
```
returns the "logits_variance" issue in the title, or the following: `TypeError: Expected float32, got of type...
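The `bayesian_categorical_crossentropy(100, 10)` call presumably returns a loss closure over 100 Monte Carlo samples and 10 classes. A hedged NumPy sketch of the underlying idea from Kendall & Gal (distort the logits with Gaussian noise scaled by the predicted standard deviation, then average the cross-entropy over the samples); this is illustrative, not the repo's Keras implementation:

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def mc_aleatoric_crossentropy(true, logits, log_var, T=100, seed=0):
    """Monte Carlo aleatoric loss sketch (after Kendall & Gal):
    sample noisy logits using the predicted variance, average the
    categorical cross-entropy over T samples."""
    rng = np.random.default_rng(seed)
    std = np.sqrt(np.exp(log_var))   # (N, 1) predicted standard deviation
    losses = []
    for _ in range(T):
        noise = rng.standard_normal(logits.shape) * std
        probs = softmax(logits + noise)
        losses.append(-(true * np.log(probs + 1e-12)).sum(axis=-1))
    return np.mean(losses, axis=0)   # per-example loss, shape (N,)

true = np.eye(3)[[0, 2]]                            # two one-hot labels, C = 3
logits = np.array([[3.0, 0.5, 0.1], [0.2, 0.1, 2.5]])
log_var = np.array([[0.0], [0.0]])                  # unit predicted variance
loss = mc_aleatoric_crossentropy(true, logits, log_var)
```

Confident predictions with large predicted variance get their logits distorted more, which softens the loss on hard examples; this is one way to read what the logits_variance head is being trained to do.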

Code is not working for `results_dict['min loss'].append("{:.2E}".format(Decimal(Y[Z[min_var_idx,idx],idx])))`