cnn-text-classification-pytorch
Is using F.softmax(logits) together with CrossEntropyLoss() correct?
The PyTorch docs for torch.nn.CrossEntropyLoss(weight=None, size_average=None, ignore_index=-100, reduce=None, reduction='mean') say: "This criterion combines nn.LogSoftmax() and nn.NLLLoss() in one single class." So I guess applying F.softmax(logits) before CrossEntropyLoss() is wrong, since the softmax would effectively be applied twice?
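For reference, here is a minimal sketch of what I believe the correct pattern is (the tensor shapes and variable names are just illustrative, not taken from this repo): pass the raw logits to CrossEntropyLoss for training, and apply F.softmax only when you actually need probabilities, e.g. at inference time.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical logits for a batch of 4 examples and 3 classes.
logits = torch.randn(4, 3, requires_grad=True)
targets = torch.tensor([0, 2, 1, 1])

criterion = nn.CrossEntropyLoss()

# Training: feed raw logits; CrossEntropyLoss applies log-softmax internally.
loss = criterion(logits, targets)
loss.backward()

# Inference: apply softmax only if probabilities are needed for reporting.
with torch.no_grad():
    probs = F.softmax(logits, dim=1)
    preds = probs.argmax(dim=1)
```

If F.softmax is applied before CrossEntropyLoss, the loss still decreases during training but the gradients are flattened by the extra softmax, so it seems better to keep the two separate as above.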