About the LDAM Loss

sakumashirayuki opened this issue 4 years ago · 4 comments

Thanks a lot for your code! I have read your paper and code, and it's a really good idea, but I have a question about the LDAM loss. It concerns the last line, where the basic cross_entropy function from PyTorch is called.

    def forward(self, x, target):
        # build a one-hot mask: 1 at each sample's target class
        index = torch.zeros_like(x, dtype=torch.uint8)
        index.scatter_(1, target.data.view(-1, 1), 1)

        index_float = index.type(torch.cuda.FloatTensor)
        # self.m_list[None, :] adds a batch dimension to m_list;
        # the matmul picks out each sample's class margin
        batch_m = torch.matmul(self.m_list[None, :], index_float.transpose(0, 1))
        # reshape the (1, batch) row into a (batch, 1) column
        batch_m = batch_m.view((-1, 1))
        x_m = x - batch_m
        # only the target label position gets x_m; the others keep x
        output = torch.where(index, x_m, x)
        return F.cross_entropy(self.s * output, target, weight=self.weight)
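
For reference, here is roughly how m_list and s are set when the loss is constructed (a sketch based on my reading of LDAMLoss.__init__ in this repo; cls_num_list is the per-class sample count and max_m caps the largest margin):

    import numpy as np
    import torch
    import torch.nn as nn

    class LDAMLoss(nn.Module):
        def __init__(self, cls_num_list, max_m=0.5, weight=None, s=30):
            super(LDAMLoss, self).__init__()
            # margin per class: m_j proportional to 1 / n_j^(1/4),
            # rescaled so the largest margin equals max_m
            m_list = 1.0 / np.sqrt(np.sqrt(cls_num_list))
            m_list = m_list * (max_m / np.max(m_list))
            self.m_list = torch.cuda.FloatTensor(m_list)
            assert s > 0
            self.s = s
            self.weight = weight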

Why is the output multiplied by s (here, 30)? Is it just to make the loss larger? We don't do this for the focal loss.
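
To make the question concrete, here is a quick numeric sketch (my own, assuming the final layer is normalized so the logits are cosine similarities bounded in [-1, 1]): without a scale factor, the softmax stays close to uniform even for a confident prediction.

    import torch
    import torch.nn.functional as F

    # hypothetical cosine-similarity logits for one sample, 3 classes
    logits = torch.tensor([[0.9, 0.2, -0.5]])
    target = torch.tensor([0])

    # unscaled: probabilities ~[0.57, 0.28, 0.14], loss ~0.56
    print(F.softmax(logits, dim=1))
    print(F.cross_entropy(logits, target))

    # scaled by s = 30: probabilities ~[1.0, 0.0, 0.0], loss ~0.0
    s = 30
    print(F.softmax(s * logits, dim=1))
    print(F.cross_entropy(s * logits, target))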

sakumashirayuki · Dec 01 '20

the same question

lipingcoding · Jan 06 '21

same question here as well

jinwon-samsung · Apr 22 '21

same question here

xiangly55 · Sep 05 '21

I'm not sure if anyone found an explanation for this, but I also have the same question.

tkasarla · Apr 10 '22