Results: 2 comments by sabbasi

Hello, thanks for your comment. I am wondering why you only include the attention layer in the generator_test and discriminator_test modules. Why are two different architectures defined for the generator and the discriminator? Best...

Thanks for your response, that makes sense. I asked because, in the SAGAN paper, there is an attention layer in both the discriminator and the generator, during both training and test time.
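For context, a minimal sketch of the kind of SAGAN-style self-attention block being discussed, assuming a PyTorch implementation; the class name, parameter names, and where the block is inserted are illustrative assumptions, not taken from the repository in question.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfAttention(nn.Module):
    """SAGAN-style self-attention over the spatial positions of a feature map.

    Hypothetical sketch: the same block can be placed in both the generator
    and the discriminator, and it remains active at test time because gamma
    and the 1x1 convolutions are ordinary learned parameters of the model.
    """

    def __init__(self, in_channels):
        super().__init__()
        # 1x1 convolutions project the feature map into query/key/value spaces.
        self.query = nn.Conv2d(in_channels, in_channels // 8, kernel_size=1)
        self.key = nn.Conv2d(in_channels, in_channels // 8, kernel_size=1)
        self.value = nn.Conv2d(in_channels, in_channels, kernel_size=1)
        # Learnable residual weight, initialized to zero as in the SAGAN paper,
        # so the block starts as an identity mapping.
        self.gamma = nn.Parameter(torch.zeros(1))

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).view(b, -1, h * w).permute(0, 2, 1)  # (b, hw, c//8)
        k = self.key(x).view(b, -1, h * w)                      # (b, c//8, hw)
        attn = F.softmax(torch.bmm(q, k), dim=-1)               # (b, hw, hw)
        v = self.value(x).view(b, -1, h * w)                    # (b, c, hw)
        out = torch.bmm(v, attn.permute(0, 2, 1)).view(b, c, h, w)
        return self.gamma * out + x
```

Under this reading, both networks would insert such a block at an intermediate feature resolution, which is why one would expect the attention layer to appear in the training-time definitions of the generator and discriminator as well, not only in the test modules.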