Focal-Loss-implement-on-Tensorflow

Why add "neg_p_sub" into "per_entry_cross_ent"

Open · CoderChang opened this issue 5 years ago · 1 comment

For multi-label cases, I think there is no need to include neg_p_sub when computing per_entry_cross_ent. We shouldn't treat every class as an independent binary classification and sum the binary cross-entropy losses; instead, we should treat all classes as a whole and compute the multi-class cross-entropy loss.

The original code:

per_entry_cross_ent = - alpha * (pos_p_sub ** gamma) * tf.log(tf.clip_by_value(sigmoid_p, 1e-8, 1.0)) \
                          - (1 - alpha) * (neg_p_sub ** gamma) * tf.log(tf.clip_by_value(1.0 - sigmoid_p, 1e-8, 1.0))
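
For context, here is a minimal, self-contained sketch of the full loss around that line. The definitions of pos_p_sub (1 - p for positive entries, 0 elsewhere) and neg_p_sub (p for negative entries, 0 elsewhere) are inferred from the focal-loss formulation, not copied from this repository, and tf.where / tf.math.log are used in place of the older array_ops.where / tf.log calls:

import tensorflow as tf

def sigmoid_focal_loss(logits, targets, alpha=0.25, gamma=2.0):
    """Per-entry sigmoid focal loss in the style of the snippet above.

    logits, targets: float tensors of the same shape; targets are 0/1.
    pos_p_sub/neg_p_sub definitions are assumptions based on the paper.
    """
    sigmoid_p = tf.nn.sigmoid(logits)
    zeros = tf.zeros_like(sigmoid_p)
    # pos_p_sub = 1 - p where the target is positive, 0 elsewhere
    pos_p_sub = tf.where(targets > zeros, targets - sigmoid_p, zeros)
    # neg_p_sub = p where the target is negative, 0 elsewhere
    neg_p_sub = tf.where(targets > zeros, zeros, sigmoid_p)
    per_entry_cross_ent = (
        -alpha * (pos_p_sub ** gamma)
        * tf.math.log(tf.clip_by_value(sigmoid_p, 1e-8, 1.0))
        - (1 - alpha) * (neg_p_sub ** gamma)
        * tf.math.log(tf.clip_by_value(1.0 - sigmoid_p, 1e-8, 1.0))
    )
    return tf.reduce_sum(per_entry_cross_ent)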

I think it should be:

per_entry_cross_ent = - alpha * (pos_p_sub ** gamma) * tf.log(tf.clip_by_value(sigmoid_p, 1e-8, 1.0))
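
Dropping the negative term amounts to the softmax (multi-class) form of focal loss, -alpha * (1 - p_t)^gamma * log(p_t), applied to the probability p_t of each example's true class. A minimal sketch of that variant, assuming single-label integer targets (the function name and argument layout are my own, not from this repository):

import tensorflow as tf

def softmax_focal_loss(logits, labels, alpha=0.25, gamma=2.0):
    """Multi-class focal loss: -alpha * (1 - p_t)^gamma * log(p_t).

    logits: [batch, num_classes] float tensor.
    labels: [batch] int tensor of true class indices.
    """
    probs = tf.nn.softmax(logits, axis=-1)
    # p_t: probability assigned to the true class of each example
    one_hot = tf.one_hot(labels, depth=tf.shape(logits)[-1])
    p_t = tf.reduce_sum(probs * one_hot, axis=-1)
    p_t = tf.clip_by_value(p_t, 1e-8, 1.0)
    return tf.reduce_sum(-alpha * (1.0 - p_t) ** gamma * tf.math.log(p_t))

The trade-off is the one raised in this issue: the sigmoid form treats each class as an independent binary problem (and so keeps the negative term), while the softmax form assumes exactly one true class per example.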

CoderChang avatar Sep 21 '18 06:09 CoderChang

I have tried this, and the author's method is better, although I thought you were right.

lyuweiwang avatar Dec 05 '18 03:12 lyuweiwang