signet-pytorch
Why set number_samples and running_loss to zero?
https://github.com/VinhLoiIT/signet-pytorch/blob/36c5ee637a135255675c63d0bfbcb9aa777bba14/train.py#L37-L38
In train.py, after printing the result at every log_interval step, why are number_samples and running_loss set to zero?
But in the eval step they are not reset to 0:
https://github.com/VinhLoiIT/signet-pytorch/blob/36c5ee637a135255675c63d0bfbcb9aa777bba14/train.py#L58-L59
In the training phase, what we're doing in the logging line is computing the average loss over the last few iterations. That value is not very accurate due to the randomness of the data sampling, so we reset the counters after each log so that earlier losses do not affect the current running loss.
However, in the evaluation phase there is no randomness, so we just sum all the losses and divide once by the total number of samples.
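Here is a minimal sketch of the two accumulation schemes described above (the function names, log_interval value, and made-up batch losses are illustrative, not the repo's actual code): training resets the counters after each log to get a windowed average, while evaluation accumulates everything and divides once at the end.

```python
def train_logging(losses, batch_size, log_interval=4):
    """Windowed average: reset counters after each log, as train.py does."""
    running_loss, number_samples = 0.0, 0
    for step, loss in enumerate(losses, start=1):
        running_loss += loss * batch_size   # sum of per-sample losses in this window
        number_samples += batch_size
        if step % log_interval == 0:
            print(f"step {step}: avg loss over last {log_interval} steps = "
                  f"{running_loss / number_samples:.4f}")
            running_loss, number_samples = 0.0, 0   # <-- the two lines in question


def eval_logging(losses, batch_size):
    """Cumulative average: sum over the whole set, divide once at the end."""
    running_loss, number_samples = 0.0, 0
    for loss in losses:
        running_loss += loss * batch_size
        number_samples += batch_size
    print(f"eval: avg loss over all {number_samples} samples = "
          f"{running_loss / number_samples:.4f}")


# Example usage with made-up per-batch losses
batch_losses = [1.0, 0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3]
train_logging(batch_losses, batch_size=32)
eval_logging(batch_losses, batch_size=32)
```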
I hope this will help.
Thank you for your comment. I meant that if we remove these two lines from the training print, nothing will happen.