RAT-GAN
About attention
Hello, I would like to ask how this attention is handled and where the code for it is.
It is implemented in D_GET_LOGITS_att in model.py.
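Roughly, the module looks like the following minimal sketch of a spatial-attention discriminator head. The class and variable names here are illustrative only; the actual D_GET_LOGITS_att in model.py may be structured differently:

```python
# Hypothetical sketch of a spatial-attention discriminator head (PyTorch).
# Names and shapes are illustrative; the real D_GET_LOGITS_att may differ.
import torch
import torch.nn as nn

class DGetLogitsAtt(nn.Module):
    def __init__(self, ndf=64, nef=256):
        super().__init__()
        # 1x1 conv producing one attention logit per spatial location
        self.att_conv = nn.Conv2d(ndf * 8 + nef, 1, kernel_size=1)
        # joint conv + final logit head, as in a typical conditional D
        self.joint_conv = nn.Sequential(
            nn.Conv2d(ndf * 8 + nef, ndf * 8, 3, 1, 1, bias=False),
            nn.LeakyReLU(0.2, inplace=True),
        )
        self.logits = nn.Conv2d(ndf * 8, 1, kernel_size=4, stride=4)

    def forward(self, h, c):
        # h: image features (B, ndf*8, 4, 4); c: sentence embedding (B, nef)
        c = c.view(c.size(0), -1, 1, 1).expand(-1, -1, h.size(2), h.size(3))
        hc = torch.cat([h, c], dim=1)
        # softmax over spatial positions -> attention map (B, 1, 4, 4)
        att = torch.softmax(
            self.att_conv(hc).flatten(2), dim=-1
        ).view(h.size(0), 1, h.size(2), h.size(3))
        # reweight features by attention before scoring real/fake
        out = self.joint_conv(hc * att)
        return self.logits(out).view(-1), att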
So how is the final result shown? What is the effect of attention?
As shown in the paper, attention gives a small improvement over RAT alone. In my results, the effect is more noticeable with weaker baselines or at 64×64 resolution.
I want to reproduce the results now, but there is no code for visualizing the effect of the attention block. How can I do that?
Thank you for your interest. I will upload the visualization code for attention and RAT.
Can you upload it now? I'm in a hurry. Thank you.
Please wait about half an hour. What's more, a major contribution of this attention is solving the mode collapse caused by attention in the discriminator. It's the first attention module in D.
OK, thank you.
Hey, it's here: https://github.com/senmaoy/RAT-GAN/blob/main/code/main_visulize_of_attenion
This code is ugly, and you will probably need to debug it yourself. I will refine it later.
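In the meantime, the usual approach is to upsample the attention map to the image resolution and overlay it as a heatmap. This is only a generic sketch of that approach, not the linked main_visulize_of_attenion script itself:

```python
# Generic sketch for overlaying a low-resolution attention map on an image.
# NOT the repository's visualization script; it only illustrates the
# common upsample-and-overlay approach.
import torch
import torch.nn.functional as F
import matplotlib.pyplot as plt

def show_attention(img, att, alpha=0.5):
    # img: (3, H, W) tensor in [0, 1]; att: (1, h, w) attention weights
    H, W = img.shape[1:]
    att = F.interpolate(att.unsqueeze(0), size=(H, W),
                        mode="bilinear", align_corners=False)[0, 0]
    # normalize to [0, 1] so the colormap covers the full range
    att = (att - att.min()) / (att.max() - att.min() + 1e-8)
    plt.imshow(img.permute(1, 2, 0).cpu().numpy())
    plt.imshow(att.cpu().numpy(), cmap="jet", alpha=alpha)
    plt.axis("off")
    plt.show()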
Sorry, I find that my spatial attention module is still unstable. Falling back to the baseline discriminator usually leads to better performance.