stylegan2-pytorch
Using attention layers
I tried different values (`--attn-layers [1,2,3]`) for the attention mechanism, but the results are either the same or worse. Has anyone found a way to improve FID/IS scores using attention?
@AlexTS1980 Hey Alex! Made a new update today and was wondering if you would be willing to retry with the attention layers. There may have been a bug!
@lucidrains May I ask what kind of linear attention you are using in this repo, and why you follow it up with a linear layer? Also what kind of improvements did you observe when adding attention? Was it visual improvements suggestive of modelling long-range dependencies or evaluation (FID) score improvements?
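For readers unfamiliar with the term in the question above, here is a minimal sketch of what "linear attention followed by a linear layer" generally refers to: efficient attention in the style of Shen et al., where softmax is applied separately over the query feature dimension and the key sequence dimension, reducing the cost from O(n²·d) to O(n·d²). This is a generic illustration, not necessarily the exact variant used in this repo; all weight names here are made up for the example.

```python
import numpy as np

def softmax(x, axis):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def linear_attention(x, wq, wk, wv, wo):
    """Generic linear (efficient) attention over a flattened feature map.

    x: (n, d) tokens, e.g. an h*w spatial grid flattened to n = h*w.
    wq, wk, wv, wo: (d, d) projection weights (hypothetical names).
    """
    q = softmax(x @ wq, axis=-1)  # softmax over the feature dimension
    k = softmax(x @ wk, axis=0)   # softmax over the sequence (spatial) dimension
    v = x @ wv
    context = k.T @ v             # (d, d) global context, no n x n matrix needed
    out = q @ context             # (n, d) attended features
    return out @ wo               # trailing linear (output) projection

rng = np.random.default_rng(0)
n, d = 16, 8                      # e.g. a 4x4 feature map with 8 channels
x = rng.standard_normal((n, d))
weights = [rng.standard_normal((d, d)) for _ in range(4)]
y = linear_attention(x, *weights)
print(y.shape)  # (16, 8)
```

The trailing `wo` projection mixes the attended features back into the channel space, which is the usual role of the linear layer that follows an attention block.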
+1, when I use the attention layers, I also get worse FID scores.