WGAN-tensorflow
question about high level function
You wrote `ly.conv2d_transpose(train, 128, 3, stride=2, activation_fn=tf.nn.relu, normalizer_fn=ly.batch_norm, padding='SAME', weights_initializer=tf.random_normal_initializer(0, 0.02))`. I wonder whether the `is_training` argument of `ly.batch_norm` is always set to False here, and if so, does batch_norm really work in this case?
Thank you for pointing it out. Training mode and inference mode differ only in the statistics used for normalization, so I suspect the effect will not be dramatic, but I have added the is_training flag in WGAN.ipynb.
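For reference, a minimal sketch (assuming TensorFlow 1.x with `tf.contrib.layers` imported as `ly`) of how an `is_training` flag can be threaded into `ly.batch_norm` through the layer's `normalizer_params` argument. The placeholder and input shape below are illustrative, not the notebook's exact code:

```python
import tensorflow as tf
import tensorflow.contrib.layers as ly

# Boolean placeholder to switch batch_norm between training and inference behavior.
is_training = tf.placeholder(tf.bool, shape=[], name='is_training')

# Illustrative generator feature map (batch, height, width, channels).
train = tf.placeholder(tf.float32, shape=[None, 4, 4, 256])

output = ly.conv2d_transpose(
    train, 128, 3, stride=2,
    activation_fn=tf.nn.relu,
    normalizer_fn=ly.batch_norm,
    # Forwarded to ly.batch_norm: use batch statistics when True,
    # moving averages when False.
    normalizer_params={'is_training': is_training},
    padding='SAME',
    weights_initializer=tf.random_normal_initializer(0, 0.02))

# ly.batch_norm places its moving-average update ops in GraphKeys.UPDATE_OPS
# by default, so they need to be run alongside the training op.
update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)
```

At inference time the same graph can be run with `feed_dict={is_training: False}`, so the layer normalizes with the accumulated moving averages instead of the current batch statistics.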