
Regression layer output mismatch between train and test time

Open evinpinar opened this issue 4 years ago • 2 comments

It seems that while training, the regression prediction tensor is reshaped into [B, C, H, W], whereas at test time only half of the output tensor is kept, resulting in a [B, C/2, H, W] tensor (as in here). Is there a reason for that, or is it a bug? I wish to calculate the validation loss, but that is not possible with the current setting. Should it be changed from `ord_prob = F.softmax(x, dim=1)[:, 0, :, :, :]` to `prob = F.log_softmax(x, dim=1).view(N, C, H, W)`?

evinpinar avatar Dec 30 '20 13:12 evinpinar
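For reference, a minimal sketch (not the repository's exact code) of a DORN-style ordinal head that computes both outputs in a single code path, so the [N, C, H, W] log-probability tensor needed by the ordinal loss is also available at validation time. The class name and the assumption that `x` carries `2 * ord_num` channels (one "exceeds threshold k" / "does not exceed" logit pair per ordinal bin) follow the discussion above:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class OrdinalRegressionLayer(nn.Module):
    """Sketch of a DORN-style ordinal head exposing both outputs.

    Assumes x has shape [N, 2 * ord_num, H, W]: two logits
    (">= bin k" vs. "< bin k") per ordinal threshold.
    """

    def forward(self, x):
        N, C, H, W = x.size()
        ord_num = C // 2

        # Pair up the two logits per threshold: [N, 2, ord_num, H, W].
        x = x.view(N, 2, ord_num, H, W)

        # Log-probabilities over the 2-way axis, flattened back to
        # [N, C, H, W] -- the shape the ordinal loss consumes, so the
        # loss can also be computed at validation time.
        log_prob = F.log_softmax(x, dim=1).view(N, C, H, W)

        # Probability that each threshold is exceeded: [N, ord_num, H, W].
        prob = F.softmax(x, dim=1)[:, 0, :, :, :]

        # Decoded ordinal label = number of thresholds with prob > 0.5.
        ord_label = torch.sum(prob > 0.5, dim=1)

        return log_prob, prob, ord_label
```

With a head like this, the same forward pass serves training (feed `log_prob` to the loss) and evaluation (use `prob` and `ord_label`), which is what the question later in this thread about dropping the `self.training` branch amounts to.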

yes

dontLoveBugs avatar Dec 30 '20 14:12 dontLoveBugs


Does that mean the code should be changed to

`prob = F.log_softmax(x, dim=1).view(N, C, H, W)`
`ord_label = torch.sum((prob > 0.5), dim=1)`
`return prob, ord_label`

In other words, does that mean there is no need to distinguish between `self.training` and not `self.training`?

WBS-123 avatar Aug 08 '22 09:08 WBS-123
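One detail worth noting about the snippet in the last comment: `F.log_softmax` produces values that are always <= 0, so a `prob > 0.5` test on log-probabilities is never true and would decode every pixel to label 0. The 0.5 threshold only makes sense on `F.softmax` probabilities. A small sketch with toy logits (hypothetical shapes, chosen only for illustration):

```python
import torch
import torch.nn.functional as F

# Toy logits for one pixel and three ordinal thresholds:
# shape [N=1, 2, ord_num=3, H=1, W=1].
x = torch.tensor([[10., -10., 0.]]).view(1, 1, 3, 1, 1)
x = torch.cat([x, -x], dim=1)  # channel 0 = "exceeds", channel 1 = "does not"

log_prob = F.log_softmax(x, dim=1)[:, 0]  # values are <= 0
prob = F.softmax(x, dim=1)[:, 0]          # values in (0, 1)

print((log_prob > 0.5).sum().item())  # 0 -- log-probs can never exceed 0.5
print((prob > 0.5).sum().item())      # 1 -- only the first threshold is exceeded
```

So dropping the `self.training` branch is fine, but the label decoding should keep using softmax probabilities (as in the sketch earlier in this thread) while the loss consumes the log-probabilities.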