
Discriminator outputs suggest model isn't trained

Open gordon-lim opened this issue 3 years ago • 0 comments

May I ask what the last outputs from your training loop were? I trained using your weights but got `[781/782] Loss_D: 0.0267 Loss_G: 8.2782 D(x): 0.9833 D(G(z)): 0.0072 / 0.0008`. Yet the PyTorch DCGAN tutorial says:

> **D(x)** - the average output (across the batch) of the discriminator for the all-real batch. This should start close to 1, then theoretically converge to 0.5 when G gets better. Think about why this is.
>
> **D(G(z))** - average discriminator outputs for the all-fake batch. The first number is before D is updated and the second number is after D is updated. These numbers should start near 0 and converge to 0.5 as G gets better. Think about why this is.
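For context, the metrics quoted above are just batch means of the discriminator's sigmoid outputs. The sketch below (pure Python, no PyTorch; the output values are illustrative, not taken from the actual model) shows how they are computed and why readings like `D(x): 0.9833` and `D(G(z)): 0.0072` indicate the discriminator still separates real from fake easily, rather than sitting at the 0.5 equilibrium the tutorial describes:

```python
def batch_mean(outputs):
    """Average discriminator output (sigmoid probabilities) across a batch."""
    return sum(outputs) / len(outputs)

# Hypothetical sigmoid outputs for an all-real batch and an all-fake batch.
disc_real = [0.99, 0.98, 0.97, 0.99]         # D's outputs on real images
disc_fake_before = [0.01, 0.00, 0.02, 0.01]  # on fakes, before the D update
disc_fake_after = [0.00, 0.00, 0.01, 0.00]   # on fakes, after the D update

D_x = batch_mean(disc_real)            # the tutorial's D(x)
D_G_z1 = batch_mean(disc_fake_before)  # first D(G(z)) number
D_G_z2 = batch_mean(disc_fake_after)   # second D(G(z)) number

# At the theoretical equilibrium, G matches the data distribution, D cannot
# tell real from fake, and all three averages converge to 0.5. Averages near
# 1 and 0 instead mean D is still winning, as in the log line above.
print(f"D(x): {D_x:.4f}  D(G(z)): {D_G_z1:.4f} / {D_G_z2:.4f}")
```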

But this is not the case with your pre-trained model. Hence I would like to ask whether you were aware of this, and whether you have possible explanations for it.

Thank you for reading my message and I look forward to hearing from you soon.

Yours Sincerely, Gordon

gordon-lim · Aug 17 '21 01:08