dcgan.torch
Not resetting gradParameters
Hello,

In main.lua, while training the discriminator in the function fDx, why aren't gradParameters reset to zero after the pass with real images and before the one with fake images? I think it should matter. I would like to know your thoughts.
If gradParameters were reset to 0 after the first backward pass, that pass would become useless, because you would lose the gradients it computed. The backward method calls accGradParameters, which accumulates the gradients (it adds them into the gradParameters tensor). Since you want to train the discriminator on both real and fake examples, you need to call backward twice without resetting the gradParameters tensor before the second call.

In the end, the returned gradParameters tensor contains the sum of the gradients with respect to the real and the fake data.