
CS-Net (MICCAI 2019) and CS2-Net (MedIA 2020)

6 CS-Net issues, sorted by recently updated

Why is the result NaN when I compute it with CS2Net?

RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation
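For context, this PyTorch error means a tensor that autograd saved for the backward pass was later modified in place. A minimal, repository-independent sketch that reproduces it (not taken from the CS-Net code):

```python
import torch

x = torch.ones(3, requires_grad=True)
y = torch.sigmoid(x)    # sigmoid saves its output for the backward pass
y += 1                  # in-place edit bumps y's version counter
y.sum().backward()      # RuntimeError: one of the variables needed for
                        # gradient computation has been modified by an
                        # inplace operation
```

Replacing the in-place update with an out-of-place one (y = y + 1) or cloning the tensor before modifying it usually resolves this class of error.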

Line 128 in train.py: should the return statement be dedented?
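A hypothetical sketch of the pattern this question refers to (not the actual train.py): with the return indented inside the epoch loop, training stops after the first epoch; dedenting it lets all epochs run.

```python
def train(model, loader, criterion, optimizer, num_epochs):
    for epoch in range(num_epochs):
        for images, labels in loader:
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
        # return model   # indented here: exits after the first epoch
    return model         # dedented: returns after all epochs finish
```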

https://insight-journal.org/midas/community/view/21

I cannot understand part of the code in the ChannelAttentionBlock function and would appreciate the author's clarification. The confusing code is:

affinity = torch.matmul(proj_query, proj_key)
affinity_new = torch.max(affinity, -1, keepdim=True)[0].expand_as(affinity) - affinity
affinity_new = self.softmax(affinity_new)

Why subtract affinity to obtain affinity_new? Doesn't this give channels with low affinity higher weights, which runs contrary to the original purpose? The paper does not explain the meaning of this operation in detail, so I cannot understand this part of the code. I would appreciate the author's guidance.
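For reference, a self-contained sketch of the channel-attention pattern the quoted lines come from (modelled on DANet-style channel attention; the surrounding reshapes and the learnable residual weight gamma are assumptions, not necessarily the exact CS2-Net implementation):

```python
import torch
import torch.nn as nn

class ChannelAttentionBlock(nn.Module):
    """Sketch of the channel-attention pattern quoted in the issue."""
    def __init__(self):
        super().__init__()
        self.gamma = nn.Parameter(torch.zeros(1))  # learnable residual weight (assumed)
        self.softmax = nn.Softmax(dim=-1)

    def forward(self, x):
        b, c, h, w = x.size()
        proj_query = x.view(b, c, -1)                  # B x C x N
        proj_key = x.view(b, c, -1).permute(0, 2, 1)   # B x N x C
        affinity = torch.matmul(proj_query, proj_key)  # B x C x C channel affinities
        # Subtracting each entry from its row maximum negates the ordering of
        # the logits, so after the softmax the smallest raw affinities receive
        # the largest weights -- the behaviour the question is asking about.
        affinity_new = torch.max(affinity, -1, keepdim=True)[0].expand_as(affinity) - affinity
        affinity_new = self.softmax(affinity_new)
        proj_value = x.view(b, c, -1)                  # B x C x N
        out = torch.matmul(affinity_new, proj_value).view(b, c, h, w)
        return self.gamma * out + x                    # residual connection

# Usage: output has the same shape as the input feature map.
# out = ChannelAttentionBlock()(torch.randn(2, 64, 32, 32))
```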