External-Attention-pytorch
When using your packaged modules, should they be placed inside the network's `forward`?
```python
def forward(self, x):
    x = self.conv1(x)
    x = self.bn1(x)
    x = self.relu(x)
    x = self.maxpool(x)
    x = self.layer1(x)
    x = self.layer2(x)
    x = self.layer3(x)
    x = self.layer4(x)
    se = SEAttention(channel=512, reduction=8)
    x = se(x)
```
If not, where should it go? Any guidance would be appreciated.
Yes, it is used the same way as the modules defined in PyTorch.
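
For concreteness, here is a minimal sketch of that usage: instantiate `SEAttention` once in `__init__`, then call it in `forward` like any other layer. The `Net` class, the single conv layer, and the import path below are illustrative assumptions, not code from this thread.

```python
import torch
from torch import nn
# Import path assumed from this repo's layout; adjust to your setup
from model.attention.SEAttention import SEAttention

class Net(nn.Module):  # hypothetical minimal backbone for illustration
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 512, kernel_size=3, padding=1)
        # Define the attention block once here so its weights are
        # registered as submodule parameters, trained by the optimizer,
        # and moved along with the model by .to(device)
        self.se = SEAttention(channel=512, reduction=8)

    def forward(self, x):
        x = self.conv(x)  # (B, 512, H, W) feature map
        x = self.se(x)    # apply the attention block like any other layer
        return x

if __name__ == "__main__":
    net = Net()
    out = net(torch.randn(1, 3, 32, 32))
    print(out.shape)  # torch.Size([1, 512, 32, 32])
```

Note that defining the block inside `forward`, as in the snippet above, re-initializes its weights on every call, so they are never trained and do not move to the GPU with the rest of the model.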