External-Attention-pytorch
🍀 PyTorch implementations of various Attention Mechanisms, MLPs, Re-parameterization, and Convolution modules, which are helpful for further understanding the papers. ⭐⭐⭐
1. padding is set to "same" because of the dilation; otherwise `res=res.expand_as(x)` in line 49 will fail. 2. removed unnecessary `b, c, _, _ = x.size()` since it's never used....
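As a sketch of why the padding matters: with dilation, a fixed integer padding no longer preserves spatial size, so the attended map can no longer be broadcast back onto the input. A minimal self-contained illustration (the layer shapes here are made up for demonstration, not taken from the repository's code):

```python
import torch
import torch.nn as nn

x = torch.randn(1, 8, 32, 32)

# With dilation=2 and padding=1, a 3x3 conv shrinks the map: 32 -> 30,
# so a later res.expand_as(x) would fail on the size mismatch.
conv_bad = nn.Conv2d(8, 8, kernel_size=3, padding=1, dilation=2)
print(conv_bad(x).shape)  # torch.Size([1, 8, 30, 30])

# padding="same" (PyTorch >= 1.9) keeps the spatial size despite dilation.
conv_good = nn.Conv2d(8, 8, kernel_size=3, padding="same", dilation=2)
res = conv_good(x)
print(res.shape)  # torch.Size([1, 8, 32, 32])
res = res.expand_as(x)  # now succeeds
```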
How should this library be used within the mmdetection framework?
`receptive_field_size = tensor[0][0].numel()` raises `IndexError: index 0 is out of bounds for dimension 0`
The error looks like: `RuntimeError: Input type (torch.cuda.FloatTensor) and weight type (torch.FloatTensor) should be the same` or `Expected all tensors to be on the same device, but found at least two devices, cuda:0...`
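This class of error usually means the input tensor and the module's weights live on different devices. A minimal sketch of the usual fix, using a plain convolution as a stand-in for any attention module from the repository so the example is self-contained:

```python
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Stand-in for an attention module; any nn.Module behaves the same way.
module = nn.Conv2d(64, 64, kernel_size=1)
x = torch.randn(2, 64, 7, 7)

# The mismatch arises when the module stays on CPU while the input is on
# GPU (or vice versa). Fix: move both to the same device before forward().
module = module.to(device)
x = x.to(device)
out = module(x)
print(out.shape)  # torch.Size([2, 64, 7, 7]), weights and input on one device
```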
When the inputs are images from two directions, is there a module or method that computes a similarity matrix and uses it as weights to fuse the two directional feature maps? Thanks!
Could you tell me how to solve the problem, shown in the figure, that occurs when using SEA?
I notice the code in BAM is:

```python
def forward(self, x):
    b, c, _, _ = x.size()
    sa_out = self.sa(x)
    ca_out = self.ca(x)
    weight = self.sigmoid(sa_out * ca_out)
    weight = self.sigmoid(sa_out + ca_out)  # here
    out = (1 + weight) * x
    return out
```

here...
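For context: the first `weight` assignment in that snippet is dead code, since it is immediately overwritten, and the BAM paper combines the two branch outputs by addition before the sigmoid. A minimal sketch of that additive fusion (the helper `bam_fuse` is mine, not the repository's):

```python
import torch

def bam_fuse(x, sa_out, ca_out):
    """BAM-style fusion: M(F) = sigmoid(Mc(F) + Ms(F)), F' = F * (1 + M(F)).
    The multiplicative sigmoid line in the snippet above never takes effect."""
    weight = torch.sigmoid(sa_out + ca_out)  # element-wise sum, broadcasting maps
    return (1 + weight) * x

x = torch.randn(2, 16, 8, 8)
sa = torch.randn(2, 1, 8, 8)   # spatial attention map, broadcast over channels
ca = torch.randn(2, 16, 1, 1)  # channel attention map, broadcast over space
out = bam_fuse(x, sa, ca)
print(out.shape)  # torch.Size([2, 16, 8, 8])
```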
Could you add comments to the code explaining the meaning of each parameter? That would be friendlier to novices, thank you. For example, what do d_model and...
Sorry to bother you. I'm working on fault diagnosis for wind-turbine data; the data is 100000*80, and I want to run experiments with a 1D-CNN. However, I've found that the attention mechanisms commonly used with convolutions, such as ECA and SE, generally handle 4-D data, i.e. image data. May I ask whether the ECA or SE attention mechanisms can work with a 1D-CNN? Looking forward to your reply.
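In principle both mechanisms adapt to 1-D sequences by replacing the 2-D pooling and convolutions with their 1-D counterparts. A minimal sketch of a squeeze-and-excitation block for `(batch, channels, length)` inputs (this `SE1d` class is illustrative and not part of the repository):

```python
import torch
import torch.nn as nn

class SE1d(nn.Module):
    """Squeeze-and-Excitation for 1-D feature maps of shape (batch, channels, length)."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.squeeze = nn.AdaptiveAvgPool1d(1)  # global average over the length axis
        self.excite = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _ = x.size()
        w = self.squeeze(x).view(b, c)    # (b, c): per-channel statistics
        w = self.excite(w).view(b, c, 1)  # (b, c, 1): per-channel weights
        return x * w                      # rescale channels, broadcast over length

x = torch.randn(4, 80, 128)  # e.g. 80 feature channels, 128 time steps
out = SE1d(80)(x)
print(out.shape)  # torch.Size([4, 80, 128])
```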
Running ./model/attention/SKAttention.py on its own raises an error: RuntimeError: Can't call numpy() on Tensor that requires grad. Use tensor.detach().numpy() instead. My guess is that the `sum` call in line 40, `U=sum(conv_outs)`, needs to convert the tensor to numpy, and a tensor that participates in gradient computation can't be converted directly?
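One note on that guess: Python's built-in `sum` over a list of tensors reduces with tensor `+` and never touches NumPy, so that line by itself should not trigger the error; it usually comes from calling `.numpy()` on a grad-tracking tensor somewhere else. A small self-contained sketch:

```python
import torch

conv_outs = [torch.randn(2, 8, 4, 4, requires_grad=True) for _ in range(3)]

# Built-in sum reduces with tensor addition -- no NumPy involved, grads intact.
U = sum(conv_outs)
print(U.requires_grad)  # True

# Equivalent, more explicit form:
U2 = torch.stack(conv_outs, dim=0).sum(dim=0)
assert torch.allclose(U, U2)

# The reported error only appears when converting a grad-tracking tensor:
# U.numpy()           # RuntimeError: Can't call numpy() on Tensor that requires grad
arr = U.detach().numpy()  # fine: detach first, then convert
```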