PyTorch-progressive_growing_of_gans

batch sizes don't match paper

Open jcpeterson opened this issue 6 years ago • 3 comments

Why were these default batch sizes chosen? The original paper uses a batch size of 16 for resolutions 4x4 through 128x128, which should be faster overall than what the code currently uses.

jcpeterson avatar Nov 22 '17 19:11 jcpeterson

I did not use the same batch sizes as the paper because I ran the code on a 1080 GPU with only 8GB of memory, not a P100, which has 16GB. Besides, the dataset is also different: I used the CelebA dataset (cropped and aligned), and I'm now switching to CelebA-HQ.
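
For reference, a minimal sketch of how a per-resolution batch-size table could be expressed. The "paper" values follow the 16-for-4x4-to-128x128 setting mentioned above; the 8GB values are only illustrative guesses for a 1080, not this repo's actual defaults:

```python
# Resolution -> batch size. PAPER_BATCH_SIZES follows the setting quoted above;
# GTX1080_BATCH_SIZES is a hypothetical reduced table for an 8GB card.
PAPER_BATCH_SIZES = {4: 16, 8: 16, 16: 16, 32: 16, 64: 16, 128: 16}
GTX1080_BATCH_SIZES = {4: 16, 8: 16, 16: 16, 32: 16, 64: 8, 128: 4}  # illustrative only

def batch_size_for(resolution, table):
    # Fall back to the smallest listed batch size for resolutions beyond the table.
    return table.get(resolution, min(table.values()))
```
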

github-pengge avatar Nov 23 '17 13:11 github-pengge

I see. I also noticed the learning rate seems a bit high to be used for both G and D. Any reason why?

jcpeterson avatar Nov 25 '17 03:11 jcpeterson

I found that the learning rate is not fixed in the official code, so I'm changing the learning-rate scheduler now.
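
As a rough illustration, here is one way to adjust the learning rate in PyTorch when training moves to a new resolution instead of keeping it fixed. The schedule values, stand-in networks, and Adam betas are placeholders, not the values this repo or the official code actually uses:

```python
import torch
import torch.nn as nn

# Hypothetical resolution -> learning rate schedule (placeholder values).
LR_SCHEDULE = {4: 1e-3, 8: 1e-3, 16: 1e-3, 32: 1e-3, 64: 5e-4, 128: 5e-4}

def set_lr(optimizer, lr):
    # Update the learning rate of every parameter group in place.
    for group in optimizer.param_groups:
        group['lr'] = lr

# Stand-in modules so the example runs; replace with the actual G and D.
G = nn.Linear(512, 3)
D = nn.Linear(3, 1)
g_optim = torch.optim.Adam(G.parameters(), lr=LR_SCHEDULE[4], betas=(0.0, 0.99))
d_optim = torch.optim.Adam(D.parameters(), lr=LR_SCHEDULE[4], betas=(0.0, 0.99))

# When the networks grow to a new resolution, e.g. 64x64:
set_lr(g_optim, LR_SCHEDULE[64])
set_lr(d_optim, LR_SCHEDULE[64])
```
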

github-pengge avatar Nov 25 '17 03:11 github-pengge