Stand-Alone-Self-Attention

problem with unfold

Open vainaixr opened this issue 5 years ago • 1 comment

After the line

k_out = k_out.contiguous().view(batch, self.groups, self.out_channels // self.groups, height, width, -1)

it gives this error:

RuntimeError: shape '[2, 1, 16, 34, 34, -1]' is invalid for input of size 294912

This is because unfold is applied to k_out before this line, so the height and width recorded from the input no longer match the spatial size of the unfolded tensor.
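
A minimal sketch of the mismatch, assuming kernel_size=3 with padding=0 and a 34x34 input (hypothetical values chosen so the unfolded tensor has the 294912 elements reported in the traceback; this is not the repository's exact code):

import torch
import torch.nn.functional as F

batch, channels, height, width = 2, 16, 34, 34   # height, width are taken from the input x
kernel_size, padding, stride = 3, 0, 1           # padding too small for this kernel size (assumed)

k_out = torch.randn(batch, channels, height, width)
k_out = F.pad(k_out, [padding] * 4)
# each unfold yields (dim + 2*padding - kernel_size + 1) windows along its spatial dimension
k_out = k_out.unfold(2, kernel_size, stride).unfold(3, kernel_size, stride)
print(k_out.shape)  # torch.Size([2, 16, 32, 32, 3, 3]) -> 32 windows, but height/width are 34

try:
    k_out.contiguous().view(batch, 1, channels, height, width, -1)
except RuntimeError as e:
    print(e)  # shape '[2, 1, 16, 34, 34, -1]' is invalid for input of size 294912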

vainaixr commented on Oct 12 '19 06:10

You should change the padding to 1, so the width and height of the feature maps stay the same after the 1x1 convolution and unfold (for kernel_size=3, padding = (kernel_size - 1) // 2 = 1); see the sketch after the snippet below.

class AttentionConv(nn.Module):
    # padding=1 keeps the unfolded spatial size equal to the input's when kernel_size=3
    def __init__(self, in_channels, out_channels, kernel_size, stride=1, padding=1, groups=1, bias=False):
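
For illustration, a minimal sketch (hypothetical shapes, not the repository's exact forward pass) of why padding = (kernel_size - 1) // 2 makes the view succeed: the number of unfold windows per spatial dimension then equals the original height and width.

import torch
import torch.nn.functional as F

batch, groups, channels, height, width = 2, 1, 16, 34, 34
kernel_size, stride = 3, 1
padding = (kernel_size - 1) // 2  # = 1, the value suggested above

k_out = torch.randn(batch, channels, height, width)
k_out = F.pad(k_out, [padding] * 4)                                          # (2, 16, 36, 36)
k_out = k_out.unfold(2, kernel_size, stride).unfold(3, kernel_size, stride)  # (2, 16, 34, 34, 3, 3)
k_out = k_out.contiguous().view(batch, groups, channels // groups, height, width, -1)
print(k_out.shape)  # torch.Size([2, 1, 16, 34, 34, 9])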

kevinco27 commented on Jan 09 '20 09:01