
Why is the loss combination of CrossEntropy and Dice used?

Open lferlings opened this issue 3 years ago • 2 comments

Hey, thanks for your great work on this project milesial. Could you (or someone seeing this) please explain to me why a combination of both the standard cross entropy loss and the dice loss is used? Thank you

lferlings avatar Mar 19 '22 16:03 lferlings

There's some info on this in this thread on stackexchange.

Yu-AnChen avatar Mar 31 '22 01:03 Yu-AnChen

Such a combination is discussed in this survey of segmentation loss functions, https://arxiv.org/pdf/2006.14822.pdf:

> **J. Combo Loss**
> Combo loss [15] is defined as a weighted sum of Dice loss and a modified cross entropy. It attempts to leverage the flexibility of Dice loss of class imbalance and at same time use cross-entropy for curve smoothing.

Maybe the motivation is related to learning-curve smoothing, according to https://stats.stackexchange.com/a/344403/241224?

Edit: the combination indeed seems to balance smooth optimization against segmentation-quality evaluation, cf. https://pythonawesome.com/semantic-segmentation-models-datasets-and-losses-implemented-in-pytorch/:

> CE Dice loss, the sum of the Dice loss and CE: CE gives smooth optimization while Dice loss is a good indicator of the quality of the segmentation results.
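To make the idea concrete, here is a minimal PyTorch sketch of such a combined loss. This is an illustrative implementation, not the exact code from this repo: the function names, the `alpha` weighting, and the `eps` smoothing term are assumptions for the example.

```python
import torch
import torch.nn.functional as F


def dice_loss(probs: torch.Tensor, target: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    """Soft multi-class Dice loss.

    probs:  (N, C, H, W) softmax probabilities
    target: (N, H, W) integer class indices
    """
    num_classes = probs.shape[1]
    # One-hot encode the target to (N, C, H, W) to match probs
    target_onehot = F.one_hot(target, num_classes).permute(0, 3, 1, 2).float()
    dims = (0, 2, 3)  # reduce over batch and spatial dims, keep classes
    intersection = (probs * target_onehot).sum(dims)
    cardinality = probs.sum(dims) + target_onehot.sum(dims)
    dice = (2.0 * intersection + eps) / (cardinality + eps)
    return 1.0 - dice.mean()


def combo_loss(logits: torch.Tensor, target: torch.Tensor, alpha: float = 0.5) -> torch.Tensor:
    """Weighted sum of cross-entropy (smooth optimization) and Dice (overlap quality).

    `alpha` is a hypothetical weighting hyperparameter, here 0.5 by default.
    """
    ce = F.cross_entropy(logits, target)
    dl = dice_loss(F.softmax(logits, dim=1), target)
    return alpha * ce + (1.0 - alpha) * dl
```

With this setup, cross-entropy supplies well-behaved per-pixel gradients early in training, while the Dice term directly penalizes poor region overlap, which matters when foreground classes are rare.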

Blupblupblup avatar Aug 29 '22 15:08 Blupblupblup

The above answers are correct.

milesial avatar Dec 06 '22 19:12 milesial