convolutional-handwriting-gan
About gradient balancing
There are three backward calls inside the gradient balancing between the generator loss & the OCR loss:
- https://github.com/amzn/convolutional-handwriting-gan/blob/f7daa5045a281be23c1d20c5b74f12ffbddf69f9/models/ScrabbleGAN_baseModel.py#L354
- https://github.com/amzn/convolutional-handwriting-gan/blob/f7daa5045a281be23c1d20c5b74f12ffbddf69f9/models/ScrabbleGAN_baseModel.py#L368
- https://github.com/amzn/convolutional-handwriting-gan/blob/f7daa5045a281be23c1d20c5b74f12ffbddf69f9/models/ScrabbleGAN_baseModel.py#L374
Won't these calls accumulate gradients that are then applied at `optimizer.step()`? I thought the objective here was simply to compute the gradient-balancing terms and multiply them into the loss. Could you give an overview of what's going on inside the gradient balancing, in case I misunderstood something? For reference, `backward()` accumulates into `.grad` by default rather than overwriting it, which is the behavior I'm worried about; a minimal demonstration (not from the repo):
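```python
import torch

# backward() sums into .grad; a second call without zero_grad doubles it.
w = torch.ones(1, requires_grad=True)
loss = (2 * w).sum()
loss.backward(retain_graph=True)
print(w.grad)  # tensor([2.])
loss.backward()
print(w.grad)  # tensor([4.])  -- gradients from both calls were accumulated
```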
I just looked at the code, and I think you're right: there should be a `self.netG.zero_grad()` between the first and second backprop. The third one isn't there to accumulate gradients; it is performed (without `retain_graph`) just so that the graph won't be retained.
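To make the intended flow concrete, here is a minimal sketch of the scheme with that fix in place. It is not the repo's exact code: `netG` is the repo's generator, but `loss_G`, `loss_OCR`, `grad_balance`, and the norm over all generator parameters are illustrative stand-ins for how the actual model computes the balancing term.

```python
import torch

def balanced_G_backward(netG, loss_G, loss_OCR, grad_balance=1.0, eps=1e-8):
    """Sketch of gradient balancing with the zero_grad fix discussed above."""
    params = [p for p in netG.parameters() if p.requires_grad]

    # 1st backward: measure the gradient norm of the adversarial loss alone.
    netG.zero_grad()
    loss_G.backward(retain_graph=True)
    adv_norm = torch.sqrt(sum(p.grad.pow(2).sum()
                              for p in params if p.grad is not None))

    # The fix: clear the adversarial gradients before the second backward,
    # otherwise the OCR gradients below get summed on top of them.
    netG.zero_grad()

    # 2nd backward: measure the gradient norm of the OCR loss alone.
    loss_OCR.backward(retain_graph=True)
    ocr_norm = torch.sqrt(sum(p.grad.pow(2).sum()
                              for p in params if p.grad is not None))

    # Scale the OCR loss so its gradient magnitude matches (a fraction of)
    # the adversarial one; the norms carry no graph, so alpha is a constant.
    alpha = grad_balance * adv_norm / (ocr_norm + eps)

    # 3rd backward: the combined loss used for the actual update. No
    # retain_graph here, so this call also frees the graph.
    netG.zero_grad()
    (loss_G + alpha * loss_OCR).backward()
    return alpha
```

With this ordering, the first two backward calls are only used to measure gradient magnitudes, and `optimizer.step()` applies just the gradients from the final combined backward.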