Self-Attention-GAN-Tensorflow

Some problems were encountered when migrating the Google attention module to other code

Open Halleyawoo opened this issue 4 years ago • 3 comments

Hello author! Thanks for your code! In the function def hw_flatten(x): return tf.reshape(x, shape=[x.shape[0], -1, x.shape[-1]]), my x.shape[0] is None. How can I change the code?
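For context, a common workaround (just a sketch, assuming TF 1.x and that only the batch dimension is unknown at graph-build time; hw_flatten_dynamic is a name used here for illustration) is to let tf.reshape infer the batch dimension with -1, or to read it from the dynamic shape, instead of using the static x.shape[0]:

```python
import tensorflow as tf

def hw_flatten(x):
    # x: [batch, height, width, channels]; the batch dim may be None (unknown)
    # when the graph is built, so avoid the static x.shape[0].
    _, h, w, c = x.get_shape().as_list()
    # -1 lets TensorFlow infer the batch dimension at run time.
    return tf.reshape(x, [-1, h * w, c])

def hw_flatten_dynamic(x):
    # Alternative: take the batch size from the dynamic shape tensor.
    s = tf.shape(x)  # evaluated at run time, so None is not a problem
    return tf.reshape(x, [s[0], s[1] * s[2], x.get_shape().as_list()[-1]])
```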

Halleyawoo · Dec 31 '20 08:12

Solution: change tf.reshape(max_pool, [batch_num, -1]) to tf.layers.flatten(max_pool).
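Restating that suggestion as a sketch (assuming TF 1.x's tf.layers API and that max_pool and batch_num are the names from the comment above), the change looks roughly like this; tf.layers.flatten keeps the batch dimension symbolic, so a None batch size is fine:

```python
# Before: fails when batch_num is None at graph-build time
# flat = tf.reshape(max_pool, [batch_num, -1])

# After: flattens everything except the batch dimension,
# whether or not the batch size is known statically.
flat = tf.layers.flatten(max_pool)
```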

zhangweiweicpp · Jan 13 '22 15:01

Thank you and I'll reply to your e-mail ASAP. Best regards! HM.G/Donald Kam

DonaldKam · Jan 13 '22 15:01

In the file SAGAN.py, line 117, the code batch_size, height, width, num_channels = x.get_shape().as_list() does not get the batch_size number; it gets "?" (i.e. None). The code after it that uses batch_size then raises an error, so we can give the batch_size value directly.
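A hedged sketch of the two options mentioned above (assuming TF 1.x and an input tensor x shaped [batch, height, width, channels]; this is not the actual SAGAN.py code): either hard-code the batch size you actually feed, or take the batch dimension from the dynamic shape and keep only the spatial/channel dims static:

```python
import tensorflow as tf

# Option 1: supply the batch size directly instead of reading it from the graph.
batch_size = 16  # must match the batch size you actually feed
_, height, width, num_channels = x.get_shape().as_list()

# Option 2: mix static and dynamic shapes; batch_size stays a run-time tensor.
batch_size = tf.shape(x)[0]
height, width, num_channels = x.get_shape().as_list()[1:]
```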

zhangweiweicpp · Jan 13 '22 16:01