ResidualAttentionNetwork-pytorch

about the code "out_interp = self.interpolation1(out_middle_2r_blocks) + out_down_residual_blocks1"

Open · ZachZou-logs opened this issue Aug 27, 2019 · 4 comments

Hello, thank you for your code! I have a question, though. This operation doesn't seem to appear in the paper: in the soft mask branch, only the skip connections use an addition. Could you help me understand this line?

out_interp = self.interpolation1(out_middle_2r_blocks) + out_down_residual_blocks1

ZachZou-logs avatar Aug 27 '19 07:08 ZachZou-logs
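For context, the soft mask branch in question has an hourglass (down/up) structure. Below is a minimal sketch of that pattern, keeping the names from the quoted line but replacing the repository's residual-block stacks with plain convolutions, so this is an illustration of the shape bookkeeping, not the repo's exact module:

```python
import torch
import torch.nn as nn

class SoftMaskBranchSketch(nn.Module):
    """Hourglass-style soft mask branch, simplified for illustration.

    Layer names mirror the quoted line; the residual blocks are
    stand-in convolutions, so this is a sketch, not the repo's code.
    """

    def __init__(self, channels):
        super().__init__()
        self.mpool = nn.MaxPool2d(kernel_size=3, stride=2, padding=1)
        # Stand-ins for the repo's ResidualBlock stacks:
        self.down_residual_blocks1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.middle_2r_blocks = nn.Conv2d(channels, channels, 3, padding=1)
        self.interpolation1 = nn.UpsamplingBilinear2d(scale_factor=2)

    def forward(self, x):
        out_down_residual_blocks1 = self.down_residual_blocks1(self.mpool(x))
        out_middle_2r_blocks = self.middle_2r_blocks(
            self.mpool(out_down_residual_blocks1))
        # The questioned line: the upsampled deep features are added
        # element-wise to the shallower pre-pooling features.
        out_interp = self.interpolation1(out_middle_2r_blocks) \
            + out_down_residual_blocks1
        return out_interp

# Shapes line up: a 32x32 input is pooled to 16x16 and then 8x8, and
# interpolation1 brings 8x8 back to 16x16 to match the addition.
x = torch.randn(1, 64, 32, 32)
print(SoftMaskBranchSketch(64)(x).shape)  # torch.Size([1, 64, 16, 16])
```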

I referred to the Caffe version; you can consider it a trick.

tengshaofeng avatar Sep 16 '19 12:09 tengshaofeng
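Reading that answer, a paper-faithful variant would simply drop the extra addition, while the repository keeps it to match the original Caffe implementation. A hypothetical helper making the two variants explicit (the function and flag names are mine, not the repo's):

```python
def soft_mask_upsample(middle, shallow, interpolation, use_caffe_trick=True):
    """Upsample the deep soft-mask features, optionally adding the
    shallower features (the extra addition discussed in this issue)."""
    out = interpolation(middle)
    if use_caffe_trick:
        out = out + shallow  # repo / Caffe version
    return out  # with use_caffe_trick=False: the paper-figure version
```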

Hello, I have the same question now. Have you solved it yet? If so, could you share your experience?

daijiahai avatar Nov 04 '19 08:11 daijiahai

> Hello, I have the same question now. Have you solved it yet? If so, could you share your experience?

I changed to another model, and it works.

jorie-peng avatar Jan 28 '21 11:01 jorie-peng

> I referred to the Caffe version; you can consider it a trick.

Hi, thanks for your code. When I use 'ResidualAttentionModel_92_32input_update', I hit the problem described in this issue because of 'AttentionModule_stage1_cifar'. If I change the model to 'ResidualAttentionModel_92', it works, but then I cannot load the provided pretrained weights because of many parameter mismatches. Do you have a good way to load the provided model, or is there another pretrained model I can use? Thanks!

jorie-peng avatar Jan 28 '21 12:01 jorie-peng
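A common workaround for such mismatches is to load only the checkpoint tensors whose names and shapes match the target model. This is a generic PyTorch pattern, not something from this repository, and the checkpoint path in the usage note is hypothetical:

```python
import torch

def load_matching_weights(model, checkpoint_path):
    """Copy only the checkpoint entries whose names and shapes match
    the model; everything else is skipped and reported."""
    state = torch.load(checkpoint_path, map_location="cpu")
    if not isinstance(state, dict):  # checkpoint saved as a full model
        state = state.state_dict()
    model_state = model.state_dict()
    matched = {k: v for k, v in state.items()
               if k in model_state and v.shape == model_state[k].shape}
    skipped = [k for k in state if k not in matched]
    model_state.update(matched)
    model.load_state_dict(model_state)
    print(f"loaded {len(matched)} tensors, skipped {len(skipped)}: {skipped}")

# Usage (checkpoint path is hypothetical):
# model = ResidualAttentionModel_92()
# load_matching_weights(model, "model_92_sgd.pkl")
```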