Loss_ToolBox-PyTorch
issue on BinaryFocalLoss
I have two questions:
- Why are you applying sigmoid at the beginning of the forward call?
- Is the label smoothing correct? Shouldn't we apply it to the target rather than the output?
- Why aren't you dividing neg_loss by num_neg when num_pos == 0?
This code has several deficiencies, and you have highlighted some of them.
> Why are you applying sigmoid at the beginning of the forward call?
It's reasonable to assume that the model @Hsuxu used outputs raw logits (i.e. no sigmoid/softmax was applied at the end of the network), so the first thing one must do before computing any loss is apply an activation function.
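To make the three points concrete, here is a minimal, framework-agnostic sketch of a binary focal loss that (a) applies sigmoid to raw logits inside the loss, (b) applies label smoothing to the *targets* rather than the output, and (c) still averages the negative loss by `num_neg` when `num_pos == 0`. All names and default hyperparameters here are illustrative assumptions, not @Hsuxu's actual API.

```python
import math

def binary_focal_loss(logits, hard_targets, gamma=2.0, alpha=0.25, smooth=0.1):
    # (a) sigmoid applied here, assuming the model emits raw logits
    probs = [1.0 / (1.0 + math.exp(-z)) for z in logits]
    # (b) label smoothing applied to the targets, not the model output:
    # hard 0/1 labels are pulled toward 0.5 by the smoothing factor
    soft = [t * (1.0 - smooth) + 0.5 * smooth for t in hard_targets]

    pos_loss = neg_loss = 0.0
    num_pos = sum(1 for t in hard_targets if t == 1)
    num_neg = len(hard_targets) - num_pos
    eps = 1e-8  # guard against log(0)

    for p, t, hard in zip(probs, soft, hard_targets):
        # focal binary cross-entropy with a soft target t
        term = (-alpha * t * (1 - p) ** gamma * math.log(p + eps)
                - (1 - alpha) * (1 - t) * p ** gamma * math.log(1 - p + eps))
        if hard == 1:
            pos_loss += term
        else:
            neg_loss += term

    if num_pos == 0:
        # (c) still average over the negatives rather than
        # returning the raw (batch-size dependent) sum
        return neg_loss / max(num_neg, 1)
    return pos_loss / num_pos + neg_loss / max(num_neg, 1)
```

With the `num_neg` normalization in place, an all-negative batch yields the same loss regardless of how many identical negatives it contains, which is the behavior the third question asks about.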
Let's hope @Hsuxu will answer the rest of the questions soon!