
Discriminator loss remains unchanged during training

Open tearscoco opened this issue 4 years ago • 6 comments

Great work!

I have been working with my own dataset recently. During training I noticed two odd things about the loss, and I would really appreciate your guidance if you have run into the same problems before.

a. When I train on my own dataset, the whole process runs well except that the discriminator (D) loss stays at 1 throughout training. I followed the same procedure in which the discriminator only starts after several epochs. It seems that D loses its ability to distinguish real from fake. I decreased the number of warm-up epochs before the discriminator starts, but ended up with the same result (see the sketch after point b for what a constant value of 1 implies).

b. I also tried excluding the D loss and keeping only the perceptual loss. The reconstructions look fine, except that blocking artifacts appear in regions with complex patterns. I wonder whether you ran into the same oddity.
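For context (not part of the original report): with the default VQLPIPSWithDiscriminator loss in taming/modules/losses/vqperceptual.py, the discriminator is trained with a hinge loss, and that loss evaluates to exactly 1 whenever the discriminator's logits are close to zero for both real and reconstructed images. A D loss pinned at 1 therefore usually indicates the discriminator is not yet producing a useful signal, rather than a numerical bug. A minimal sketch of that behaviour (illustrative, not a drop-in snippet from the repo):

```python
import torch
import torch.nn.functional as F

def hinge_d_loss(logits_real, logits_fake):
    # Hinge loss of the patch discriminator, as used by
    # VQLPIPSWithDiscriminator (taming/modules/losses/vqperceptual.py).
    loss_real = torch.mean(F.relu(1.0 - logits_real))
    loss_fake = torch.mean(F.relu(1.0 + logits_fake))
    return 0.5 * (loss_real + loss_fake)

# If the discriminator cannot separate real from fake and its logits sit
# near zero for both, the reported loss is exactly 1:
logits = torch.zeros(4, 1, 30, 30)  # illustrative patch-discriminator output shape
print(hinge_d_loss(logits, logits))  # tensor(1.)
```

One thing worth checking is whether the logged logits for real and fake inputs remain near zero even after the discriminator is switched on (the disc_start step in the loss config); if they do, the discriminator is still effectively untrained, and the constant 1 is simply the hinge loss at the zero-logit point.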

All in all, I really think this work is a big step toward better text-to-image generation.

tearscoco avatar Feb 20 '21 09:02 tearscoco

Would you be willing to share your setup for training this on a custom dataset? I'm trying to work through it myself, having only just started, and would welcome some advice.

TheodoreGalanos avatar Mar 16 '21 14:03 TheodoreGalanos

Nevermind, it was actually quite easy to do!

TheodoreGalanos avatar Mar 16 '21 15:03 TheodoreGalanos

@tearscoco Did you somehow solve the issue? I've encountered the same problem (D loss during training remains 1)

nashory avatar Apr 16 '21 01:04 nashory

any updates on this?

nashory avatar Apr 26 '21 03:04 nashory

any updates on this?

Zijian007 avatar Mar 09 '23 02:03 Zijian007