UncertaintyNN
About dropout rate in training
Hi. I have a question.
I ran your "combined_evaluation.py" and it works well, but there is a part I don't understand. In the "combined_evaluation" function, you set the dropout value to 0.2 — (sess, x_placeholder, dropout_placeholder = combined_training(x, y, 0.2, learning_rate, epochs)). I thought that when I train the model, I have to set the dropout value to be the same as at prediction time.
Am I wrong?
Sorry for my poor English, and thank you.
Hi,
yes, in theory you are right: we should use the same dropout value during inference as during training. If I remember correctly (it was some time ago that I worked on this), I wanted to experiment with what happens if you use a different dropout value during inference.
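For anyone landing on this issue, the idea behind keeping dropout active at prediction time (MC dropout, Gal & Ghahramani) can be sketched in plain NumPy, independent of this repo's code. All names here are illustrative, not the repo's API; the point is that the same rate (0.2 in the script above) is used for every stochastic forward pass, and the spread across passes serves as the uncertainty estimate:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, rate, rng):
    # Inverted dropout: zero each unit with probability `rate`,
    # rescale survivors so the expected activation is unchanged.
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)

def forward(x, w, rate, rng):
    # One stochastic forward pass of a toy one-layer linear net.
    return dropout(x, rate, rng) @ w

# Toy input and weights (stand-ins for a trained model).
x = np.ones(10)
w = rng.normal(size=(10, 1))

rate = 0.2   # same rate as during training
T = 1000     # number of Monte Carlo forward passes

samples = np.stack([forward(x, w, rate, rng) for _ in range(T)])

mean = samples.mean(axis=0)  # predictive mean
std = samples.std(axis=0)    # predictive uncertainty
```

Using a different rate at inference changes both the scale of the noise and the distribution being sampled, which is why matching the training rate is the theoretically grounded choice.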
Maybe this follow-up work from Gal et al., where they optimize the dropout rate during training, might be interesting to you.
Thank you for the reply! I will read your recommendation!