pytorch-retinanet

focal_loss_alt VS focal_loss

Open LifeBeyondExpectations opened this issue 7 years ago • 3 comments

Why are there two versions of the focal loss in class FocalLoss(nn.Module)? URL: https://github.com/kuangliu/pytorch-retinanet/blob/2199fd9711fd787ae409800a499db73e6d466fd7/loss.py

LifeBeyondExpectations avatar Jun 13 '18 06:06 LifeBeyondExpectations
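
For context, the standard focal loss from the RetinaNet paper (Lin et al., 2017) scales the per-element binary cross-entropy by alpha_t * (1 - p_t)^gamma. A minimal sketch of that formulation follows; the function name and tensor shapes are illustrative assumptions, not the repository's exact focal_loss code:

```python
import torch
import torch.nn.functional as F

def focal_loss_reference(logits, targets, alpha=0.25, gamma=2.0):
    """Textbook focal loss (Lin et al., 2017) for multi-label targets.

    logits:  (N, C) raw class scores
    targets: (N, C) float one-hot / multi-label targets in {0, 1}
    """
    p = torch.sigmoid(logits)
    # p_t: probability assigned to the ground-truth class
    pt = p * targets + (1 - p) * (1 - targets)
    alpha_t = alpha * targets + (1 - alpha) * (1 - targets)
    # per-element -log(p_t)
    ce = F.binary_cross_entropy_with_logits(logits, targets, reduction='none')
    # the modulating factor (1 - p_t)^gamma stays in the autograd graph
    return (alpha_t * (1 - pt).pow(gamma) * ce).sum()
```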

focal_loss_alt() is the better implementation: it is simpler than one built on F.binary_cross_entropy(). Besides, in torch 0.4.1, binary_cross_entropy() does not backpropagate through the weight argument, which matters for the focal loss because that weight contains the modulating factor.

bitcurian avatar Mar 22 '19 14:03 bitcurian
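
To illustrate the point above: if the focal weight is passed through the weight argument of F.binary_cross_entropy(), older PyTorch versions (e.g., 0.4.1) treat it as a constant, so the modulating factor contributes nothing to the gradient; computing the product explicitly keeps it in the graph. A rough sketch with illustrative variable names and shapes (not the repository's code):

```python
import torch
import torch.nn.functional as F

logits = torch.randn(8, 20, requires_grad=True)     # illustrative shapes
targets = torch.randint(0, 2, (8, 20)).float()
alpha, gamma = 0.25, 2.0

probs = torch.sigmoid(logits)
pt = probs * targets + (1 - probs) * (1 - targets)
w = (alpha * targets + (1 - alpha) * (1 - targets)) * (1 - pt).pow(gamma)

# (a) focal weight passed as the weight argument; detached here to make
#     explicit that no gradient flows through the modulating factor
#     (old PyTorch versions behaved this way even without detach()).
loss_a = F.binary_cross_entropy(probs, targets, weight=w.detach(),
                                reduction='sum')

# (b) explicit product: the modulating factor stays in the autograd graph,
#     so its derivative is included in the backward pass.
loss_b = (w * F.binary_cross_entropy(probs, targets,
                                     reduction='none')).sum()
```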

It seems that focal_loss_alt() doesn't use the gamma parameter. Is that an oversight, or is it implicitly taken care of and I'm missing something?

sahu-ji avatar Aug 21 '19 23:08 sahu-ji

Was wondering if there is a reference for this alternate loss?

gunshi avatar Sep 25 '19 18:09 gunshi