
Why normalizing the input doesn't work

Open ZimuW opened this issue 7 years ago • 2 comments

Hi, after testing on the CIFAR-10 dataset, I noticed that if the data batch is divided by 255.0 before being fed into the network, training doesn't converge. Is there a specific reason for this?

Conversely, when training on the CelebA dataset *without* normalizing (dividing by 255.0), it doesn't converge either.
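
For context, here are the two normalization conventions most GAN implementations use; which one is correct depends on the generator's output activation (not shown in this issue, so this is a sketch of both, not a statement about this repo's code):

```python
import numpy as np

def to_unit_range(batch):
    # Scale uint8 images from [0, 255] to [0, 1];
    # pairs with a sigmoid output on the generator.
    return batch.astype(np.float32) / 255.0

def to_tanh_range(batch):
    # Scale to [-1, 1]; pairs with a tanh output on the
    # generator (the convention used in the DCGAN paper).
    return batch.astype(np.float32) / 127.5 - 1.0
```

A mismatch between the input range and the generator's output range (e.g. feeding [0, 1] data to a discriminator while the generator emits tanh outputs in [-1, 1]) is a common cause of non-convergence.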

Have you tried the CelebA dataset with this network? If so, did you run into either of these problems?

Thanks!

ZimuW avatar Mar 22 '17 20:03 ZimuW

I've seen many successful examples of rescaling CIFAR-10 to [0, 1], so that is strange. How many epochs did you run?

I haven't tried it with CIFAR-10 myself, but ZCA whitening is often used for image classification.
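
For reference, a minimal ZCA whitening sketch in NumPy (assuming a batch of flattened images, rows = samples; `eps` is a regularizer for near-zero eigenvalues):

```python
import numpy as np

def zca_whiten(X, eps=1e-5):
    # X: (n_samples, n_features) batch of flattened images.
    X = X - X.mean(axis=0)                          # center each feature
    cov = X.T @ X / X.shape[0]                      # feature covariance
    U, S, _ = np.linalg.svd(cov)                    # cov is symmetric PSD
    W = U @ np.diag(1.0 / np.sqrt(S + eps)) @ U.T   # ZCA whitening matrix
    return X @ W                                    # decorrelated, unit-variance features
```

Unlike PCA whitening, the extra rotation back through `U.T` keeps the whitened data as close as possible to the original, so whitened images still look like images.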

mynameisvinn avatar Jul 25 '17 23:07 mynameisvinn

I think the reason is that this model is prone to mode collapse, since it's the most basic GAN. WGAN converges with much higher probability.
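
To make the suggestion concrete, the original WGAN replaces the log-loss with a difference of critic scores and clips the critic's weights to enforce an approximate Lipschitz constraint. A minimal sketch of those two pieces (framework-agnostic, names are mine):

```python
import numpy as np

def wgan_critic_loss(real_scores, fake_scores):
    # The critic maximizes E[f(x)] - E[f(G(z))];
    # written as a loss to minimize, so negate it.
    return -(np.mean(real_scores) - np.mean(fake_scores))

def clip_weights(w, c=0.01):
    # Weight clipping to [-c, c] after each critic update,
    # as in the original WGAN paper (c = 0.01 there).
    return np.clip(w, -c, c)
```

Later variants (WGAN-GP) replace the clipping step with a gradient penalty, which tends to train more stably.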

ZimuW avatar Aug 26 '17 04:08 ZimuW