
Why use "g_running" and "accumulate"?

RyanYChen opened this issue 5 years ago

Hi! I'm new to the GAN field. I have a question: why use "g_running" and "accumulate" to keep a weighted average of the generator's parameters during training? And why sample from "g_running" instead of the trained generator at test time?

Is this a GAN training trick, or does it appear in a paper?

Thanks a lot.

RyanYChen avatar Jul 17 '19 10:07 RyanYChen

Yes, it appears in the Progressive GAN and StyleGAN papers, and it is widely used in GAN training (e.g. BigGAN). It is effective for model stability and sample quality. Using a moving average of parameters is often useful for the performance of neural nets in general.

rosinality avatar Jul 17 '19 13:07 rosinality
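
For readers finding this issue later, here is a minimal sketch of the parameter moving-average idea discussed above, assuming a PyTorch generator. The names `accumulate` and `g_running` mirror the ones in the question; the decay value and the `Generator` placeholder are assumptions for illustration, not necessarily the repository's exact code.

```python
import copy

def accumulate(model_ema, model, decay=0.999):
    # Exponential moving average (EMA) of parameters:
    #   ema_param <- decay * ema_param + (1 - decay) * current_param
    ema_params = dict(model_ema.named_parameters())
    params = dict(model.named_parameters())
    for name, param in params.items():
        ema_params[name].data.mul_(decay).add_(param.data, alpha=1 - decay)

# Usage sketch (Generator is a placeholder for your generator class):
# generator = Generator(...)
# g_running = copy.deepcopy(generator)        # EMA copy; never receives gradients
# g_running.eval()
# accumulate(g_running, generator, decay=0)   # initialize EMA with current weights
#
# Inside the training loop, after each generator optimizer step:
#     accumulate(g_running, generator)
#
# At test time, sample from g_running rather than generator: the averaged
# weights typically give smoother, higher-quality samples.
```
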


Thanks a lot!

RyanYChen avatar Jul 17 '19 13:07 RyanYChen