NLNL-Negative-Learning-for-Noisy-Labels
Implementation Problem for NLNL loss
Hi,
Thanks for this great implementation. When I ran this code, I found that the implementation of the NLNL loss might have a problem that causes the loss to become NaN. It might be caused by this line: https://github.com/ydkim1293/NLNL-Negative-Learning-for-Noisy-Labels/blob/921dec436184d116725fcfa1197df2806cae0ab6/main_NL.py#L193
This line sets all labels to -100. Would you mind checking this issue? Best regards, Hongxin
Hi, the cross-entropy loss in the PyTorch package sets the default `ignore_index` to -100, so labels with a value of -100 are ignored in the final loss calculation.
    def cross_entropy(
        input: Tensor,
        target: Tensor,
        weight: Optional[Tensor] = None,
        size_average: Optional[bool] = None,
        ignore_index: int = -100,
        reduce: Optional[bool] = None,
        reduction: str = "mean",
        label_smoothing: float = 0.0,
    )
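To illustrate, here is a minimal sketch (not from the repo) of how `ignore_index` behaves: entries whose target equals -100 contribute nothing to the loss, and with `reduction="mean"` the average is taken over the remaining entries only.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
logits = torch.randn(4, 10)                  # batch of 4, 10 classes
targets = torch.tensor([3, -100, 7, -100])   # two entries ignored

# Loss over the full batch: the two -100 targets are skipped,
# so it equals the loss computed over only the kept entries.
loss_all = F.cross_entropy(logits, targets)
loss_kept = F.cross_entropy(logits[[0, 2]], targets[[0, 2]])
print(torch.allclose(loss_all, loss_kept))  # True
```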
I see. But why would the loss be NaN?
I guess it's probably the PyTorch version. I don't have any further ideas about this issue.