
Using self.loss_G to backpropagate both G_A and G_B

Open kk2487 opened this issue 2 years ago • 3 comments

Hello, in dcl_model.py

Why can you use self.loss_G to backpropagate through both G_A and G_B? Is there any special way to handle this?

kk2487 avatar Feb 14 '22 06:02 kk2487

Hi kk2487, thanks for your question.

The parameters of G_A and G_B are chained together in the optimizer, and loss_G computes the losses of both G_A and G_B. Thus they can be backpropagated together and the parameters updated in one go.

See line 103 (the optimizer) and lines 202-233 (the G loss) for details.
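For illustration, here is a minimal sketch of that pattern. The toy networks and losses below are placeholders, not the actual DCLGAN objectives; only the structure (one optimizer over chained parameters, one summed loss, one backward pass) mirrors dcl_model.py:

```python
import itertools
import torch

# Two toy "generators" standing in for G_A and G_B.
G_A = torch.nn.Linear(8, 8)
G_B = torch.nn.Linear(8, 8)

# Parameters of both networks chained into a single optimizer,
# as done around line 103 of dcl_model.py.
optimizer_G = torch.optim.Adam(
    itertools.chain(G_A.parameters(), G_B.parameters()), lr=2e-4
)

x_A, x_B = torch.randn(4, 8), torch.randn(4, 8)

# Placeholder losses; in DCLGAN these are the full G_A / G_B objectives.
loss_G_A = G_A(x_A).pow(2).mean()
loss_G_B = G_B(x_B).pow(2).mean()
loss_G = loss_G_A + loss_G_B

optimizer_G.zero_grad()
loss_G.backward()   # gradients flow to both G_A and G_B in one pass
optimizer_G.step()  # one step updates both parameter sets
```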

JunlinHan avatar Feb 15 '22 04:02 JunlinHan

Thanks for your response.

I have another question.

loss_G is the sum of loss_G_A and loss_G_B. In the original design, the parameters of G_A and G_B are chained together and backpropagated using the same loss (loss_G).

Shouldn't G_A backpropagate with loss_G_A and G_B backpropagate with loss_G_B, updating the parameters separately?

kk2487 avatar Feb 15 '22 09:02 kk2487

Shouldn't G_A backpropagate with loss_G_A and G_B backpropagate with loss_G_B, updating the parameters separately?

Yes, conceptually the parameters should be updated separately, but the implementation here is identical in effect: PyTorch's autograd automatically matches each loss term to the parameters it depends on, so backpropagating the summed loss_G sends the gradients of loss_G_A only to G_A and the gradients of loss_G_B only to G_B.
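A quick toy check of this claim (not the DCLGAN code itself): because loss_G_A does not depend on G_B's parameters and vice versa, backpropagating the sum yields exactly the same gradients as two separate backward passes.

```python
import torch

G_A = torch.nn.Linear(4, 4)
G_B = torch.nn.Linear(4, 4)
x = torch.randn(2, 4)
params = list(G_A.parameters()) + list(G_B.parameters())

# Separate backward passes, one loss per network.
G_A(x).sum().backward()
G_B(x).sum().backward()
grads_separate = [p.grad.clone() for p in params]

# Reset gradients, then backpropagate the summed loss instead.
for p in params:
    p.grad = None
(G_A(x).sum() + G_B(x).sum()).backward()

# Identical, because each loss term only touches its own network.
assert all(
    torch.allclose(g, p.grad) for g, p in zip(grads_separate, params)
)
```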

JunlinHan avatar Feb 15 '22 11:02 JunlinHan