
why modulator?

lai199508 opened this issue 4 years ago • 2 comments

Hi, I'm interested in your work! I have a question: why does your implementation of focal loss use "modulator = torch.exp(-gamma * labels * logits - gamma * torch.log(1 + torch.exp(-1.0 * logits)))"?

lai199508 avatar Apr 14 '20 09:04 lai199508

It's just a rewriting of the formula, but "labels" should not appear in the code. I think it should be modulator = torch.exp(-gamma * logits - gamma * torch.log(1 + torch.exp(-1.0 * logits))).

xiaoyu825 avatar May 03 '20 01:05 xiaoyu825

It is the formula of focal loss, and "labels" should be kept. Note that y' = 1 / (1 + torch.exp(-1.0 * logits)), and both logits and labels are matrices.

Where labels is 0, modulator = (1 / (1 + torch.exp(-1.0 * logits))) ** gamma; where labels is 1, modulator = (torch.exp(-1.0 * logits) / (1 + torch.exp(-1.0 * logits))) ** gamma = (1 / (1 + torch.exp(logits))) ** gamma.

Both cases match the focal loss modulating factor (1 - p_t) ** gamma.
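For anyone reading along, here is a small numerical sanity check of that equivalence (a sketch added for illustration; the shapes, seed, and gamma value are arbitrary and not taken from the repo):

```python
import torch

torch.manual_seed(0)
gamma = 2.0
logits = torch.randn(5)
labels = torch.tensor([0., 1., 0., 1., 1.])

# Form used in the implementation under discussion:
# exp(-gamma * y * z - gamma * log(1 + exp(-z)))
modulator = torch.exp(
    -gamma * labels * logits
    - gamma * torch.log(1 + torch.exp(-1.0 * logits))
)

# Direct focal-loss form: (1 - p_t) ** gamma,
# where p_t = sigmoid(z) if y = 1 and 1 - sigmoid(z) if y = 0.
p = torch.sigmoid(logits)
p_t = torch.where(labels == 1, p, 1 - p)
direct = (1 - p_t) ** gamma

print(torch.allclose(modulator, direct))  # expected: True
```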

wutong16 avatar May 12 '20 01:05 wutong16