
printing loss, denominator fix

Open myavkat opened this issue 1 year ago • 0 comments

#4 In the for loop, the code iterates with `range(0, N, args.batchsize)`, which runs N/args.batchsize times. Therefore `sum_loss += float(loss)` executes N/args.batchsize times. But when printing the epoch loss, `sum_loss` is divided by N, which I believe is incorrect. The commit in this PR contains what I think it should be divided by instead.
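A minimal sketch of the averaging issue, assuming the per-iteration loss is already a batch mean (the function name and variables here are illustrative, not from the repository):

```python
import math

def epoch_average(batch_losses, N, batch_size):
    """Average the accumulated loss over the number of loop
    iterations rather than over the sample count N."""
    # range(0, N, batch_size) yields ceil(N / batch_size) iterations
    num_batches = math.ceil(N / batch_size)
    sum_loss = sum(batch_losses)
    # Dividing by num_batches (not N) gives the mean per-batch loss,
    # since each entry in batch_losses is itself already averaged
    # over its batch.
    return sum_loss / num_batches

# Example: 3 batch losses from N=30 samples with batch_size=10
print(epoch_average([1.0, 2.0, 3.0], N=30, batch_size=10))  # → 2.0
```

Dividing by N instead would understate the reported loss by roughly a factor of the batch size whenever each iteration's loss is a per-batch mean.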

myavkat avatar Aug 14 '23 22:08 myavkat