Shift-Net_pytorch
some questions about variants
Hello, Mr. Yan. When I read your code, I found that you created several variants on top of the original, which I really admire. But I don't understand how they differ from Shift-Net. Could you briefly introduce them? Thank you.
PatchSoft means that shift_sz is not limited to 1*1; it can also be 3*3, etc. It is equivalent to contextual attention.
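For intuition, here is a minimal, hypothetical sketch of what a patch-wise soft shift in the spirit of contextual attention could look like. This is not the repository's actual implementation; the function name patch_soft_shift, its arguments, and the cosine-similarity-plus-softmax matching are all assumptions:

```python
import torch
import torch.nn.functional as F

def patch_soft_shift(feature, mask, shift_sz=3, stride=1):
    """Hypothetical sketch: soft shift with shift_sz x shift_sz patches.

    feature: (B, C, H, W) decoder feature map
    mask:    (B, 1, H, W) binary mask, 1 = missing region
    Assumes the mask leaves at least one fully known patch.
    """
    B, C, H, W = feature.shape
    pad = shift_sz // 2
    # Extract shift_sz x shift_sz patches over the whole feature map.
    patches = F.unfold(feature, kernel_size=shift_sz, stride=stride,
                       padding=pad)                       # (B, C*k*k, L)
    patches = patches.transpose(1, 2)                     # (B, L, C*k*k)

    # Mark which patches lie entirely in the known region.
    hole = F.unfold(mask, kernel_size=shift_sz, stride=stride,
                    padding=pad).mean(dim=1)              # (B, L)
    known = (hole == 0)                                   # fully known patches

    # Cosine similarity between every patch and every known patch.
    normed = F.normalize(patches, dim=2)
    sim = torch.bmm(normed, normed.transpose(1, 2))       # (B, L, L)
    sim = sim.masked_fill(~known.unsqueeze(1), float('-inf'))
    attn = F.softmax(sim, dim=2)                          # soft shift weights

    # Each patch becomes a weighted sum of known patches.
    shifted = torch.bmm(attn, patches).transpose(1, 2)    # (B, C*k*k, L)
    out = F.fold(shifted, output_size=(H, W), kernel_size=shift_sz,
                 stride=stride, padding=pad)
    # Divide out the overlap count from the fold/unfold round trip.
    ones = torch.ones_like(feature)
    norm = F.fold(F.unfold(ones, kernel_size=shift_sz, stride=stride,
                           padding=pad),
                  output_size=(H, W), kernel_size=shift_sz,
                  stride=stride, padding=pad)
    return out / norm
```

With shift_sz=1 this degenerates to matching single feature vectors, which matches the description of the original shift; larger patches make the matching more structure-aware, as in contextual attention.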
res_xx means that I do not directly concat the shifted feature as the third part; instead, I feed it into a residual block, and the block outputs residual information for the decoder feature. It can be regarded as a more elegant way of fusing information.
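A minimal sketch of this residual-fusion idea follows. The module name ResShiftFusion and the two-conv block design are assumptions for illustration, not the repository's actual layers:

```python
import torch
import torch.nn as nn

class ResShiftFusion(nn.Module):
    """Hypothetical sketch: instead of concatenating the shifted feature
    as a third channel group, a small residual block turns it into a
    residual that is added onto the decoder feature."""

    def __init__(self, channels):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.InstanceNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.InstanceNorm2d(channels),
        )

    def forward(self, decoder_feat, shifted_feat):
        # The block predicts residual information from the shifted
        # feature; the decoder feature is refined by addition, so the
        # channel count stays unchanged (unlike a three-way concat).
        return decoder_feat + self.block(shifted_feat)
```

One appeal of this design is that it keeps the decoder's channel width fixed, whereas the original three-part concat triples it at the shift layer.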
soft_shift_net seems wrong: it shares the same forward pass as Shift-Net, but its backward pass has not been implemented correctly. I forget why I wrote it and will delete this variant when I am free.
Thank you very much for your explanation.