Explaining-In-Style-Reproducibility-Study
KL divergence NaN issue
Hi,
During GAN training I'm facing an issue: the KL divergence loss between the real-image logits and the fake-image logits becomes NaN after a couple of steps. I'm printing the KL divergence after every step, and already at the first step it is huge; in the following step it becomes negative.
After Step 1: G: 23949.90 | D: 30.51 | GP: 11.63 | Rec: 4.70 | KL: 2703973204759576721869805868380323840.00
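I wonder if the blow-up comes from computing the KL directly on softmax probabilities, where large logits overflow exp(). As a point of comparison (a minimal sketch, not necessarily what the repo does; the function name and shapes are just for illustration), a log-space version in PyTorch stays finite even with large logits:

```python
import torch
import torch.nn.functional as F

def kl_between_logits(real_logits: torch.Tensor, fake_logits: torch.Tensor) -> torch.Tensor:
    """KL( softmax(real_logits) || softmax(fake_logits) ), averaged over the batch."""
    log_p = F.log_softmax(real_logits, dim=-1)  # target distribution, in log-space
    log_q = F.log_softmax(fake_logits, dim=-1)  # input distribution, in log-space
    # log_target=True keeps both arguments in log-space; "batchmean" is the
    # reduction that matches the mathematical definition of KL per sample.
    return F.kl_div(log_q, log_p, reduction="batchmean", log_target=True)

if __name__ == "__main__":
    real = torch.randn(8, 2) * 50.0  # deliberately large logits that would overflow exp()
    fake = torch.randn(8, 2) * 50.0
    print(kl_between_logits(real, fake))  # finite, non-negative, no NaN
```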
Any idea why this is the case? @NoahVl