Loss_ToolBox-PyTorch

issue on BinaryFocalLoss

Open quancore opened this issue 4 years ago • 1 comments

I have three questions:

  1. Why are you applying sigmoid at the beginning of the forward call?
  2. Is the label smoothing correct? Shouldn't it be applied to the target rather than the output?
  3. Why are you not dividing neg_loss by num_neg when num_pos == 0?
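
For context on question 2: the usual convention is to smooth the *targets*, pulling hard 0/1 labels toward 0.5, rather than modifying the model's output. A minimal sketch of binary label smoothing (the function name and `eps` value are illustrative, not from this repo's code):

```python
import torch

def smooth_targets(targets: torch.Tensor, eps: float = 0.1) -> torch.Tensor:
    """Conventional binary label smoothing applied to the targets.

    Hard labels are moved toward 0.5: 1 -> 1 - eps/2, 0 -> eps/2.
    """
    return targets * (1.0 - eps) + 0.5 * eps

# With eps=0.1, a target of 1 becomes 0.95 and a target of 0 becomes 0.05.
```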

quancore avatar May 20 '20 21:05 quancore

This code has several deficiencies, and you have highlighted some of them.

  1. Why are you applying sigmoid at the beginning of the forward call?

There is no harm in assuming that the model @Hsuxu used produces raw logits (i.e., no sigmoid/softmax is applied inside the model), so the first thing one does before computing any loss is to apply an activation function.
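
To illustrate why the sigmoid belongs at the start of the forward call under that assumption, here is a minimal sketch of a binary focal loss that expects logits (function name, and the `gamma`/`alpha` defaults from the focal-loss paper, are illustrative; this is not the repo's actual implementation):

```python
import torch
import torch.nn.functional as F

def binary_focal_loss(logits: torch.Tensor, targets: torch.Tensor,
                      gamma: float = 2.0, alpha: float = 0.25) -> torch.Tensor:
    """Sketch of binary focal loss taking raw logits as input."""
    probs = torch.sigmoid(logits)                      # logits -> probabilities
    pt = torch.where(targets == 1, probs, 1 - probs)   # probability of the true class
    alpha_t = torch.where(targets == 1,
                          torch.full_like(probs, alpha),
                          torch.full_like(probs, 1 - alpha))
    # Numerically stable base cross-entropy computed directly from logits.
    ce = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    return (alpha_t * (1 - pt) ** gamma * ce).mean()
```

Note that the stable `binary_cross_entropy_with_logits` still consumes the raw logits; the explicit sigmoid is only needed to form the modulating factor `(1 - pt) ** gamma`.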

Let's hope @Hsuxu will answer the rest of the questions soon!

shahzad-ali avatar Oct 08 '20 15:10 shahzad-ali