
Simple Tensorflow implementation of "Self-Attention Generative Adversarial Networks" (SAGAN)

Issues (16)

In my experiments the attention maps do not show anything meaningful. I'm also not sure how you would visualize them when the attention module sits in the middle of the network.

https://github.com/taki0112/Self-Attention-GAN-Tensorflow/blob/d5237658881663103fc4d05cd86fa5b590fde0c9/SAGAN.py#L183 ![image](https://user-images.githubusercontent.com/35527568/45881689-806fcc80-bdde-11e8-9936-1cd7022682db.png) According to the paper, the transposed matrix f(x)^T should come first in the product. https://github.com/taki0112/Self-Attention-GAN-Tensorflow/blob/d5237658881663103fc4d05cd86fa5b590fde0c9/ops.py#L93 It seems you want to reshape the tensor here: x.shape[0] is the batch size and x.shape[-1]...
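For reference, a minimal NumPy sketch of the attention-map computation the issue is discussing, with the matmul order as written in the SAGAN paper (s_ij = f(x_i)^T g(x_j), i.e. the transposed key projection f sits in front). The weight matrices `Wf` and `Wg` here are hypothetical stand-ins for the 1x1-conv projections, not the repository's actual variables:

```python
import numpy as np

def attention_map(x, Wf, Wg):
    """Compute SAGAN-style attention weights.

    x:  (batch, N, C)  flattened feature map, N = H*W
    Wf: (C, Ck)        key projection   (hypothetical weights)
    Wg: (C, Ck)        query projection (hypothetical weights)
    Returns beta: (batch, N, N), rows summing to 1.
    """
    f = x @ Wf                                     # keys:    (batch, N, Ck)
    g = x @ Wg                                     # queries: (batch, N, Ck)
    # Paper's ordering: s = f(x)^T g(x) -> each entry is a key/query dot product
    s = np.matmul(g, np.transpose(f, (0, 2, 1)))   # (batch, N, N)
    # Softmax over the last axis yields the attention distribution per query
    e = np.exp(s - s.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)
```

Swapping the operands of the matmul effectively transposes the attention matrix, which is why the ordering matters when you later apply it to the value projection.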

Hi, I noticed the excellent results for the ImageNet dataset on your GitHub page. They look similar to the ones in the paper. Were these results generated by your model? If...

![image](https://user-images.githubusercontent.com/35527568/43630992-776b501c-9734-11e8-89a0-87e415b4f04b.png) I found citations 13, 16, and 30 but still don't understand the exact principle of the hinge loss. I'm also confused about why we don't use the WGAN loss function instead, since it has better performance...

Hi, I have some questions. 1. Are the results shown in this project from the training stage? It looks like you use the same data for testing as for training. 2. Could you also...

When I train on my custom dataset of 940×940×3 pictures, the following error occurs. However, when I resize the pictures to 64×64×3, it works. By the way, I know the default...