Results: 3 comments of prio1988
Hi Hai, may I ask why you used the NLL loss rather than the cross-entropy loss in training? Thanks
NLLLoss assumes that you have already applied a LogSoftmax layer on top of your network. The multi-class cross-entropy loss is torch.nn.CrossEntropyLoss. I think that probably...
If you use CrossEntropyLoss you can also skip the softmax; it is applied internally by the loss.
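A minimal sketch of the equivalence described above, assuming a standard PyTorch setup (the tensor shapes and values here are made up for illustration):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
logits = torch.randn(4, 3)            # raw network outputs: batch of 4, 3 classes
targets = torch.tensor([0, 2, 1, 0])  # ground-truth class indices

# Option 1: LogSoftmax layer on top of the network, then NLLLoss,
# which expects log-probabilities as input.
log_probs = nn.LogSoftmax(dim=1)(logits)
loss_nll = nn.NLLLoss()(log_probs, targets)

# Option 2: CrossEntropyLoss applied directly to the raw logits;
# the log-softmax step is performed internally by the loss.
loss_ce = nn.CrossEntropyLoss()(logits, targets)

print(torch.allclose(loss_nll, loss_ce))  # True: the two losses match
```

So a network trained with NLLLoss needs the LogSoftmax layer in its forward pass, while one trained with CrossEntropyLoss can output raw logits directly.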