
[BUG] - Loss calculation problem in train-loop

Open · jaggernaut007 opened this issue 10 months ago · 2 comments

Add Link

https://pytorch.org/tutorials/beginner/basics/optimization_tutorial.html

Describe the bug

The loss calculation seems to be wrong. Should we divide the total loss by the number of samples in X (the dataset size) rather than by the number of batches?

The optimiser also seems wrong.

Describe your environment

Running on Google Colab

cc @subramen @albanD

jaggernaut007 · Mar 26 '24

Hi @jaggernaut007, are you referring to this line? If so: within the testing loop, loss_fn(pred, y) returns a single scalar per batch, and with the default reduction='mean' that scalar is already averaged over the samples in the batch. test_loss therefore accumulates one per-batch mean on each iteration, which is why you divide after the loop by the number of batches, num_batches, rather than by the number of samples.
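To make this concrete, here is a minimal sketch of the pattern the tutorial's test loop follows (the helper name test_loop mirrors the tutorial; model, dataloader, and loss_fn are assumed to be defined elsewhere):

```python
import torch

def test_loop(dataloader, model, loss_fn):
    size = len(dataloader.dataset)  # total number of samples
    num_batches = len(dataloader)   # total number of batches
    test_loss, correct = 0.0, 0.0
    model.eval()
    with torch.no_grad():
        for X, y in dataloader:
            pred = model(X)
            # With the default reduction='mean', this scalar is already
            # the average loss over the samples in this batch.
            test_loss += loss_fn(pred, y).item()
            correct += (pred.argmax(1) == y).type(torch.float).sum().item()
    # test_loss is a sum of per-batch means, so divide by num_batches.
    test_loss /= num_batches
    # correct is a count of samples, so divide by the dataset size.
    correct /= size
    print(f"Accuracy: {100 * correct:>0.1f}%, Avg loss: {test_loss:>8f}")
```

Note that if loss_fn were constructed with reduction='sum' instead, the accumulated value would be a sum over individual samples, and dividing by size (the number of values of X) would then be the correct normalization; that distinction may be the source of the confusion here.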

jmarintur · Mar 29 '24

@svekars unless I am wrong, I think we can close this issue.

jmarintur · Apr 17 '24