
Not resetting gradParameters

Open ibrahim5253 opened this issue 7 years ago • 1 comment

Hello,

In main.lua, while training the discriminator in the function fDx, why aren't the gradParameters reset to zero after the pass over the real images and before the pass over the fake ones? I think it should matter. I would like to know your thoughts.

ibrahim5253 avatar Apr 01 '17 15:04 ibrahim5253

If gradParameters were reset to zero after the first backward pass, that pass would become useless, because you would lose the gradients it computed. The backward method calls accGradParameters, which accumulates gradients: it adds them into the gradParameters tensor rather than overwriting it. Since you want to train on both real and fake examples, you call backward twice without resetting gradParameters between the two calls.

In the end, the gradParameters tensor contains the sum of the gradients with respect to the real batch and the fake batch.
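The accumulation semantics can be sketched with a tiny numpy example (illustrative only, not the actual Torch code; the model, loss, and variable names are hypothetical): a backward step that *adds* into the gradient buffer, called once on a "real" example and once on a "fake" one after a single zeroing, leaves the buffer holding the sum of both gradients.

```python
import numpy as np

# Toy "discriminator": score = w . x, with squared-error loss.
# backward() ADDS into grad_w, mimicking Torch's accGradParameters,
# which accumulates rather than overwrites. All names are illustrative.

def backward(w, x, target, grad_w):
    """Accumulate dLoss/dw for loss = 0.5 * (w.x - target)^2."""
    err = w @ x - target
    grad_w += err * x          # accumulate, like accGradParameters
    return 0.5 * err ** 2

w = np.array([0.5, -1.0])
real = np.array([1.0, 2.0])    # "real" example, target 1
fake = np.array([-0.5, 0.3])   # "fake" example, target 0

# One zeroing, then two backward passes (real, then fake),
# as in fDx: the gradients from both passes accumulate.
grad_w = np.zeros_like(w)
backward(w, real, 1.0, grad_w)
backward(w, fake, 0.0, grad_w)

# Zeroing between the two passes would instead discard the
# real-batch gradient, keeping only the fake-batch one:
grad_fake_only = np.zeros_like(w)
backward(w, fake, 0.0, grad_fake_only)

print(np.allclose(grad_w, grad_fake_only))  # False: real grads were kept
```

The final `grad_w` equals the sum of the per-batch gradients, which is exactly the gradient of the combined (real + fake) loss that the discriminator update uses.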

fonfonx avatar Apr 21 '17 14:04 fonfonx