
About dropout rate in training

Open kHan0809 opened this issue 4 years ago • 2 comments

Hi. I have a question.

I ran your `combined_evaluation.py` and it works well, but there is a part I don't understand. In the `combined_evaluation` function, you pass a dropout value of 0.2: `sess, x_placeholder, dropout_placeholder = combined_training(x, y, 0.2, learning_rate, epochs)`. I thought that even when training the model, I have to set the dropout value to the same one used for prediction.

Am I wrong?

Sorry for my poor English, and thank you.

kHan0809 avatar May 24 '21 04:05 kHan0809

Hi,

yes, in theory you are right: we should use the same dropout value during inference as during training. If I remember correctly (it was some time ago that I worked on this), I wanted to experiment with what happens if you use a different dropout value during inference.
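
For anyone finding this later, the idea under discussion is Monte Carlo dropout: keep dropout active at prediction time (normally with the same rate used in training) and average many stochastic forward passes, using their spread as an uncertainty estimate. A minimal NumPy sketch of that idea, with a toy fixed-weight network (this is illustrative only, not the repository's TensorFlow implementation; all names here are made up):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-hidden-layer network with fixed random weights
# (stands in for a trained model).
W1 = rng.normal(size=(1, 32))
W2 = rng.normal(size=(32, 1))

def forward(x, dropout_rate, rng):
    """One stochastic forward pass with dropout kept ON (inverted dropout)."""
    h = np.maximum(x @ W1, 0.0)           # ReLU hidden layer
    keep = 1.0 - dropout_rate
    mask = rng.random(h.shape) < keep     # Bernoulli keep-mask
    h = h * mask / keep                   # inverted-dropout rescaling
    return h @ W2

def mc_dropout_predict(x, dropout_rate=0.2, n_samples=100, seed=1):
    """MC dropout: mean of many stochastic passes; std serves as uncertainty."""
    rng = np.random.default_rng(seed)
    preds = np.stack([forward(x, dropout_rate, rng) for _ in range(n_samples)])
    return preds.mean(axis=0), preds.std(axis=0)

x = np.array([[0.5]])
mean, std = mc_dropout_predict(x, dropout_rate=0.2)
# std > 0 because each pass drops a different random set of hidden units
```

Using a different rate at inference, as experimented with here, changes how wide that predictive spread is, which is why matching the training rate is the theoretically grounded choice.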

Maybe this follow-up work from Gal et al., where they optimize the dropout rate during training, might be interesting to you.

hutec avatar May 24 '21 16:05 hutec

Thank you for the reply! I will read your recommendation!

kHan0809 avatar May 28 '21 02:05 kHan0809