Self-Attention-GAN-Tensorflow
Problems encountered when migrating the attention module to other code
Hello author! Thanks for your code! In this function:

def hw_flatten(x): return tf.reshape(x, shape=[x.shape[0], -1, x.shape[-1]])

my x.shape[0] is None. How can I change the code?
Solution: change tf.reshape(max_pool, [batch_num, -1]) to tf.layers.flatten(max_pool).
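An alternative that keeps hw_flatten's 3-D output (which the attention matmuls need) is to pass -1 for the unknown batch dimension instead of x.shape[0]. A minimal sketch, written against the TF 2 API rather than the repo's TF 1 code:

```python
import tensorflow as tf

def hw_flatten(x):
    # Flatten height and width into one axis: (B, H, W, C) -> (B, H*W, C).
    # Using -1 for the batch dimension lets tf.reshape infer it at run time,
    # so the function also works when the static batch size is None.
    return tf.reshape(x, shape=[-1, x.shape[1] * x.shape[2], x.shape[-1]])

x = tf.zeros([2, 4, 4, 32])
print(hw_flatten(x).shape)  # (2, 16, 32)
```

This assumes height and width are statically known (true for SAGAN's fixed image sizes); only the batch dimension may be None.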
In the file SAGAN.py, line 117:

batch_size, height, width, num_channels = x.get_shape().as_list()

does not get the batch size number; it gets "?" (i.e. None), and the code that uses it afterwards raises an error. We can set the value of batch_size directly instead.
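Besides hard-coding the batch size, another option is to read it dynamically with tf.shape, which returns a tensor evaluated at run time rather than the static graph-time shape. A minimal sketch, assuming TF 2 (the repo itself targets TF 1, where the same idea applies via tf.shape in the graph):

```python
import tensorflow as tf

@tf.function(input_signature=[tf.TensorSpec([None, 4, 4, 8])])
def flatten_spatial(x):
    # Static shape: the batch entry is None ("?" in TF 1) at graph time.
    height, width, num_channels = x.get_shape().as_list()[1:]
    # Dynamic shape: tf.shape(x)[0] is a tensor holding the real batch size.
    batch_size = tf.shape(x)[0]
    return tf.reshape(x, [batch_size, height * width, num_channels])

out = flatten_spatial(tf.zeros([2, 4, 4, 8]))
print(out.shape)  # (2, 16, 8)
```

The function name and the 4x4x8 input signature here are just for illustration; the point is that tf.shape works wherever get_shape().as_list() returns None.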