M. Y. Katrancı
#4 In the train function, the for loop iterates with range(0, N, args.batchsize), so it runs N/args.batchsize times and sum_loss += float(loss) executes once per iteration. But when printing the epoch loss, sum_loss is divided by N, the dataset's size, rather than by the number of batches it was accumulated over.
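A minimal sketch of the mismatch, using a plain loop with stand-in values (N, args.batchsize, and the constant loss are placeholders for the project's actual training code, which is not shown here):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--batchsize", type=int, default=32)
args = parser.parse_args([])  # parse no CLI args so the sketch runs as-is

N = 1000  # stand-in for the dataset size

sum_loss = 0.0
num_batches = 0
for i in range(0, N, args.batchsize):
    loss = 0.5  # stand-in for the per-batch loss returned by the model
    sum_loss += float(loss)  # runs once per batch, i.e. ceil(N / args.batchsize) times
    num_batches += 1

# Buggy: divides the per-batch sum by the number of samples, so the printed
# value is roughly args.batchsize times smaller than the mean batch loss.
print("epoch loss (reported):", sum_loss / N)

# Fixed: divide by the number of iterations that contributed to sum_loss.
print("epoch loss (mean per batch):", sum_loss / num_batches)
```

An alternative fix, if dividing by N is intentional, is to accumulate the loss weighted by the batch size (assuming float(loss) is a per-sample mean), so that sum_loss / N yields the mean loss per sample.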
https://tatsuhiro-t.github.io/wslay/tutorial.html#c.communicate I am not sure whether this is a typo or the intended return value, but in the communicate paragraph of the tutorial page it says it returns 0 if it...