FOTS.PyTorch

Detection loss should not divide by all elements

Open · northeastsquare opened this issue 6 years ago · 4 comments

In loss.py, the detection loss is: `return torch.mean(L_g * y_true_cls * training_mask) + classification_loss`. But I think the mean should divide only by the number of nonzero elements of `y_true_cls * training_mask`, so maybe the loss should be written as: `return torch.sum(L_g * y_true_cls * training_mask) / torch.nonzero(y_true_cls * training_mask).shape[0] + classification_loss`.
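For concreteness, here is a minimal runnable sketch of the two normalizations (this is not the repository's actual loss.py; the tiny batch size and the division-by-zero guard are my additions):

```python
import torch

# Hypothetical per-pixel geometry loss and masks, mimicking the shapes
# reported in this thread (64 x 1 x 128 x 128); a smaller batch is used
# here so the example runs quickly.
batch, c, h, w = 2, 1, 128, 128
L_g = torch.rand(batch, c, h, w)

# Text occupies only a small region, so most pixels contribute zero.
y_true_cls = torch.zeros(batch, c, h, w)
y_true_cls[:, :, 50:60, 50:60] = 1.0
training_mask = torch.ones(batch, c, h, w)

masked = L_g * y_true_cls * training_mask

# Current code: averages over all pixels, which dilutes the loss
# because the non-text pixels dominate the denominator.
loss_mean_all = torch.mean(masked)

# Proposed fix: divide only by the number of contributing pixels.
# max(..., 1) is an extra guard (my addition) for batches with no text.
num_valid = max(torch.nonzero(y_true_cls * training_mask).shape[0], 1)
loss_mean_valid = torch.sum(masked) / num_valid

print(f"mean over all pixels:   {loss_mean_all.item():.6f}")
print(f"mean over valid pixels: {loss_mean_valid.item():.6f}")
```

When text pixels are sparse, the mean over all pixels can be orders of magnitude smaller than the mean over valid pixels, which is the scale difference the proposed change corrects.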

northeastsquare · Dec 17 '18

L_g, y_true_cls, and training_mask all have the same shape: 64 × 1 × 128 × 128.

northeastsquare · Dec 17 '18

@northeastsquare Well, the loss implementation is copied from EAST. Did your model not converge?

jiangxiluning · Dec 17 '18

> In loss.py, the detection loss is: `return torch.mean(L_g * y_true_cls * training_mask) + classification_loss`. But I think the mean should divide only by the number of nonzero elements of `y_true_cls * training_mask`, so maybe the loss should be written as: `return torch.sum(L_g * y_true_cls * training_mask) / torch.nonzero(y_true_cls * training_mask).shape[0] + classification_loss`.

I agree with you. After I changed the code as you suggested, the detection loss looks normal, as expected.

Train Epoch: 1431 [280/924 (30%)] Loss: 5.678974 Detection Loss: 1.112934 Recognition Loss: 4.566040
Train Epoch: 1431 [420/924 (45%)] Loss: 5.101629 Detection Loss: 0.889204 Recognition Loss: 4.212425
Train Epoch: 1431 [560/924 (61%)] Loss: 5.177139 Detection Loss: 0.797501 Recognition Loss: 4.379638
Train Epoch: 1431 [700/924 (76%)] Loss: 5.403103 Detection Loss: 0.942712 Recognition Loss: 4.460391
Train Epoch: 1431 [840/924 (91%)] Loss: 5.181108 Detection Loss: 0.844010 Recognition Loss: 4.337098
epoch : 1431 loss : 5.390502048261238 precious : 0.0 recall : 0.0 hmean : 0.0 val_precious : 0.0 val_recall : 0.0 val_hmean : 0.0

MiZhangWhuer · Dec 17 '18

At the beginning, the loss starts from about 16; it gradually decreases, and by about epoch 200 it reaches about 0.6, then does not go any lower.

northeastsquare · Dec 18 '18