
Using attention layers

Open · AlexTS1980 opened this issue 3 years ago · 3 comments

I tried different values (--attn-layers [1,2,3]) for the attention mechanism, but the results are either the same or worse. Did anyone find a way to improve FID/IS scores using attention?
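For reference, a hedged sketch of what I was running. The CLI flag is the one from this repo's README; the Python-side `Trainer` call and `set_data_src` are assumed to mirror the current `cli.py`, and the run name is hypothetical, so verify the exact signature before relying on it:

```python
# Equivalent CLI invocation (per this repo's README):
#   stylegan2_pytorch --data ./data --attn-layers [1,2]
#
# Hedged Python sketch -- Trainer and set_data_src are assumed to match
# the repo's cli.py; check the current source for the exact signature.
from stylegan2_pytorch import Trainer

model = Trainer(
    name = 'attn-test',       # hypothetical run name
    image_size = 128,
    attn_layers = [1, 2],     # add attention after (1-indexed) layers 1 and 2
)
model.set_data_src('./data')

for _ in range(100):          # a few training steps, just for illustration
    model.train()
```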

AlexTS1980 · Aug 17 '20

@AlexTS1980 Hey Alex! Made a new update today and was wondering if you would be willing to retry with the attention layers. There may have been a bug!

lucidrains · Oct 05 '20

@lucidrains May I ask what kind of linear attention you are using in this repo, and why you follow it up with a linear layer? Also, what kind of improvements did you observe when adding attention? Were they visual improvements suggestive of modelling long-range dependencies, or evaluation (FID) score improvements?
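For concreteness, here is a generic sketch of the pattern I mean: linear (efficient-style) attention, where queries and keys are softmaxed along different axes so the cost is linear in the number of pixels, followed by a 1x1 output projection. I'm not claiming this is the exact variant used in this repo, just illustrating the question:

```python
# Generic linear-attention sketch (in the spirit of Shen et al.'s
# "Efficient Attention") -- not necessarily this repo's exact variant.
import torch
from torch import nn

class LinearAttention(nn.Module):
    def __init__(self, dim, dim_head=64, heads=8):
        super().__init__()
        self.heads = heads
        inner = dim_head * heads
        self.to_qkv = nn.Conv2d(dim, inner * 3, 1, bias=False)
        # the trailing "linear layer": a 1x1 conv that mixes the heads and
        # projects the attended values back to the input channel count
        self.to_out = nn.Conv2d(inner, dim, 1)

    def forward(self, x):
        b, c, h, w = x.shape
        q, k, v = self.to_qkv(x).chunk(3, dim=1)
        # reshape to (batch, heads, dim_head, pixels)
        q, k, v = (t.reshape(b, self.heads, -1, h * w) for t in (q, k, v))
        # softmax over different axes lets us compute (k @ v) first,
        # giving O(n) cost in pixel count instead of O(n^2)
        q = q.softmax(dim=-2)   # over the feature dimension
        k = k.softmax(dim=-1)   # over the spatial dimension
        context = torch.einsum('bhdn,bhen->bhde', k, v)
        out = torch.einsum('bhde,bhdn->bhen', context, q)
        out = out.reshape(b, -1, h, w)
        return self.to_out(out)
```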

Dhruva-Storz · May 13 '21

+1. When I use the attention layers, I get worse FID scores.

sunpeng1996 · Sep 07 '21