Residual vs attentional blocks

valillon opened this issue 3 years ago • 0 comments

All of the generator and discriminator variants implemented here are built from either block() or block_no_sn() modules, and both of them include an internal residual connection x_0 + x by default. However, the associated paper compares residual blocks versus attentional blocks as if the two architectures were mutually exclusive, one or the other. So, does the attentional architecture reported in the paper also include residual blocks, or does this implementation not fully follow the reported architectures?
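For concreteness, here is a minimal sketch of what I mean by a block with an internal residual connection. It is written in PyTorch purely for illustration; it is not the repository's actual block() or block_no_sn() code, and the layer choices inside the block are assumptions:

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Illustrative block whose output is x_0 + f(x_0), i.e. it always carries
    a skip connection regardless of what f contains."""

    def __init__(self, channels):
        super().__init__()
        # f is a placeholder body; the exact layers are not the point here.
        self.f = nn.Sequential(
            nn.BatchNorm2d(channels),
            nn.ReLU(),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
        )

    def forward(self, x_0):
        # The internal residual connection the question refers to.
        return x_0 + self.f(x_0)

# Example: y has the same shape as x and always includes the skip path.
# x = torch.randn(1, 64, 32, 32)
# y = ResidualBlock(64)(x)
```

If every block in the implementation has this shortcut built in, then the "attentional" models here are effectively residual-plus-attention, which is what makes the paper's residual vs. attentional comparison confusing to me.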

Thanks.

valillon · May 16 '21 15:05