Explaining-In-Style-Reproducibility-Study

KL divergence NaN issue

Open harshvardhan96 opened this issue 3 years ago • 0 comments

Hi,

During GAN training I'm facing an issue: the KL divergence loss between the real-image logits and the fake-image logits becomes NaN after a couple of steps. I'm printing the KL divergence after every step, and already in the first step it is huge; in the following step it turns negative.

After Step 1: G: 23949.90 | D: 30.51 | GP: 11.63 | Rec: 4.70 | KL: 2703973204759576721869805868380323840.00
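For reference, a huge then negative KL like this often comes from computing `log(softmax(x))` directly, where a dominant logit drives `softmax` to 0 and `log` to `-inf`, which becomes NaN in the backward pass. Below is a minimal sketch of a log-space alternative; `kl_from_logits` is a hypothetical helper, not code from this repo, and assumes the loss is computed in PyTorch over classifier logits:

```python
import torch
import torch.nn.functional as F

def kl_from_logits(real_logits: torch.Tensor, fake_logits: torch.Tensor) -> torch.Tensor:
    """KL(real || fake) computed entirely in log-space.

    Taking softmax first and then log() can yield log(0) = -inf when one
    logit dominates; log_softmax stays finite for any finite logits.
    """
    log_p = F.log_softmax(real_logits, dim=-1)  # log-probabilities of real images
    log_q = F.log_softmax(fake_logits, dim=-1)  # log-probabilities of fake images
    # F.kl_div expects its input as log-probabilities; log_target=True lets the
    # target also be given in log-space. "batchmean" matches the KL definition.
    return F.kl_div(log_q, log_p, log_target=True, reduction="batchmean")
```

This sketch doesn't explain the specific numbers above, but checking whether the loss path goes through a bare `log(softmax(...))`, and whether the logits themselves explode early in training, would be a first step.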

Any idea why this is the case? @NoahVl

harshvardhan96 · Aug 03 '22 02:08