pytorch_DANN

Evaluation does not suppress dropout, which will affect performance

Splend1d opened this issue 4 years ago · 0 comments

Hi there, thanks for the great repo! I would like to point out that the dropout layer should be created in `__init__` of `Class_classifier` and registered as a submodule, so that `model.eval()` can disable it. The functional `F.dropout` defaults to `training=True` and ignores the module's training flag, but your implementation calls it inline in `forward`:

logits = self.fc2(F.dropout(logits))

By adding `self.dropout = nn.Dropout()` in `__init__` and replacing that line with

logits = self.dropout(logits)
logits = self.fc2(logits)

dropout is correctly suppressed during evaluation.

With this change I obtained a 2-3% performance gain on the tasks, using the same model checkpoint and inputs. Could you help me verify whether my understanding is correct?
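The difference can be demonstrated with a minimal sketch (the class names and layer sizes below are hypothetical, not taken from the repo): a classifier using inline `F.dropout` keeps dropping activations after `.eval()`, while one using an `nn.Dropout` submodule becomes deterministic.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class FunctionalClassifier(nn.Module):
    """Mirrors the pattern in the issue: inline F.dropout in forward."""

    def __init__(self):
        super().__init__()
        self.fc2 = nn.Linear(100, 10)

    def forward(self, x):
        # F.dropout defaults to training=True and does not see
        # self.training, so it stays active even after model.eval().
        return self.fc2(F.dropout(x))


class ModuleClassifier(nn.Module):
    """Proposed fix: register dropout in __init__ so .eval() disables it."""

    def __init__(self):
        super().__init__()
        self.dropout = nn.Dropout()
        self.fc2 = nn.Linear(100, 10)

    def forward(self, x):
        logits = self.dropout(x)
        return self.fc2(logits)


x = torch.ones(1, 100)

fixed = ModuleClassifier().eval()
# nn.Dropout is the identity in eval mode: repeated passes match exactly.
assert torch.equal(fixed(x), fixed(x))

# Even inside an eval-mode model, bare F.dropout still zeroes roughly
# half of the inputs at random (p defaults to 0.5).
print((F.dropout(torch.ones(1, 1000)) == 0).any().item())
```

An alternative one-line fix would be `F.dropout(logits, training=self.training)`, but registering the module is the more idiomatic PyTorch pattern.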

Splend1d avatar Nov 27 '20 07:11 Splend1d