
about attention

Open majun19970125 opened this issue 2 years ago • 11 comments

Hello, I would like to ask how this attention is handled, and where is the code?

majun19970125 avatar Sep 18 '22 12:09 majun19970125

It is implemented in D_GET_LOGITS_att in the model.py
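Since the actual `D_GET_LOGITS_att` code isn't quoted in this thread, here is a minimal hedged sketch of what a text-conditioned spatial attention head in the discriminator might look like (class name, channel sizes, and layer choices are all assumptions for illustration, not the real model.py implementation):

```python
import torch
import torch.nn as nn


class DGetLogitsAtt(nn.Module):
    """Hypothetical sketch of a text-conditioned spatial attention head
    for the discriminator. Names and shapes are illustrative assumptions,
    not the actual D_GET_LOGITS_att from model.py."""

    def __init__(self, ndf=64, nef=256):
        super().__init__()
        # project the sentence embedding to the image-feature channel dim
        self.proj = nn.Conv2d(nef, ndf, kernel_size=1)
        # baseline logits head: collapse the 4x4 feature map to one logit
        self.logits = nn.Conv2d(ndf, 1, kernel_size=4, stride=4)

    def forward(self, img_feat, sent_emb):
        # img_feat: (B, ndf, H, W), sent_emb: (B, nef)
        b = img_feat.size(0)
        cond = self.proj(sent_emb.view(b, -1, 1, 1))  # (B, ndf, 1, 1)
        # attention map: similarity of each spatial location to the text
        attn = torch.sigmoid((img_feat * cond).sum(dim=1, keepdim=True))
        attended = img_feat * attn  # reweight features by the attention map
        return self.logits(attended).view(b, -1), attn
```

The returned `attn` map (one weight per spatial location, in [0, 1]) is also what a visualization script would upsample and overlay on the image.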

senmaoy avatar Sep 19 '22 08:09 senmaoy

So how does the final result show the effect of attention?

majun19970125 avatar Sep 19 '22 08:09 majun19970125

As shown in the paper, compared with RAT alone, attention makes a small improvement. According to my results, its effect is more obvious with weaker baselines or at 64x64 resolution.

senmaoy avatar Sep 19 '22 09:09 senmaoy

I want to reproduce the code now, but there is no code for visualizing the attention block. How can I do that?

majun19970125 avatar Sep 19 '22 09:09 majun19970125

Thank you for your interest, I will upload the visualization code for attention and RAT.

senmaoy avatar Sep 19 '22 09:09 senmaoy

Can you upload it now? I'm in a hurry, thank you.

majun19970125 avatar Sep 19 '22 09:09 majun19970125

Give me about half an hour. What's more, a major contribution of this attention is solving the mode collapse caused by attention in the discriminator. It's the first attention module in D.

senmaoy avatar Sep 19 '22 09:09 senmaoy

ok, thank you

majun19970125 avatar Sep 19 '22 09:09 majun19970125

hey it's here: https://github.com/senmaoy/RAT-GAN/blob/main/code/main_visulize_of_attenion

senmaoy avatar Sep 19 '22 09:09 senmaoy

This code is ugly and you will probably need to debug it yourself. I will refine it later.

senmaoy avatar Sep 19 '22 09:09 senmaoy

Sorry, I find that my spatial attention module is still unstable. Falling back to the baseline discriminator usually leads to better performance.
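For anyone reproducing this, the fallback described above could be exposed as a simple switch when building the discriminator's logits head. This is a hypothetical helper (the function name, `ndf` parameter, and layer layout are assumptions, not code from the repository):

```python
import torch
import torch.nn as nn


def build_logits_head(ndf=64, use_attention=False):
    """Hypothetical helper: choose between the (unstable) spatial-attention
    head and the baseline conv head. Names are illustrative only."""
    if use_attention:
        # the spatial attention head would be constructed here
        raise NotImplementedError("spatial attention head not shown")
    # baseline discriminator head: a plain conv mapping the
    # final 4x4 feature map to a single real/fake logit
    return nn.Conv2d(ndf, 1, kernel_size=4, stride=4)


# usage: baseline path, as recommended in the comment above
head = build_logits_head(ndf=64, use_attention=False)
feat = torch.randn(2, 64, 4, 4)     # batch of final discriminator features
logit = head(feat).view(2, -1)      # one logit per image
```

Keeping the toggle in one place makes it easy to A/B the attention head against the baseline without touching the rest of the training loop.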

senmaoy avatar Jun 08 '23 05:06 senmaoy