Sheng-Yeh Chen
**torch.nn.CrossEntropyLoss()** combines _LogSoftmax_ and _NLLLoss_ in a single class. Thus, the _forward()_ function of such classification models should contain only **_"return out"_** instead of **_"return F.log_softmax(out)"_**.
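A quick sketch to verify this equivalence on toy data (the tensors here are made up for illustration):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
logits = torch.randn(4, 10)            # raw model outputs, no softmax applied
targets = torch.randint(0, 10, (4,))   # toy class labels

# CrossEntropyLoss applies log_softmax internally, so it expects raw logits.
ce = nn.CrossEntropyLoss()(logits, targets)

# Equivalent: apply log_softmax yourself, then use NLLLoss.
nll = nn.NLLLoss()(F.log_softmax(logits, dim=1), targets)

assert torch.allclose(ce, nll)  # the two losses match
```

Returning `F.log_softmax(out)` from `forward()` and then feeding it to `CrossEntropyLoss` would apply log-softmax twice, which is why the model should return raw logits.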
The code in [pytorch/vision/train.py](https://github.com/cs230-stanford/cs230-code-examples/blob/master/pytorch/vision/train.py#L70-L90) and [pytorch/vision/evaluate.py](https://github.com/cs230-stanford/cs230-code-examples/blob/master/pytorch/vision/evaluate.py#L55-L67) shows how to calculate metrics over batches of data. In train.py, since the metrics are computed only once in a while, they don't represent the...
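For computing an exact metric over all batches (rather than sampling it once in a while), a common pattern is to accumulate counts and divide at the end. A minimal sketch, with toy tensors standing in for a real DataLoader (the data here is hypothetical, not from the linked repo):

```python
import torch

torch.manual_seed(0)
# Toy batches of (logits, labels) standing in for a DataLoader.
batches = [(torch.randn(8, 5), torch.randint(0, 5, (8,))) for _ in range(3)]

total_correct, total_seen = 0, 0
for logits, labels in batches:
    preds = logits.argmax(dim=1)
    total_correct += (preds == labels).sum().item()
    total_seen += labels.size(0)

# Exact accuracy over every example seen, weighted correctly even if
# the last batch is smaller than the others.
epoch_acc = total_correct / total_seen
```

Accumulating raw counts instead of averaging per-batch accuracies avoids a subtle bias when batch sizes differ.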
Here is the original authors' implementation: http://val.serc.iisc.ernet.in/DeepFuseICCV17/