
Can't simply replace nn.Conv2d with PSConv2d

Open quantumsquirrel opened this issue 4 years ago • 2 comments

> Thanks to the compact characteristic of PSConv, you can just replace `nn.Conv2d` with `PSConv2d`. Note that there is another hyperparameter named `parts` that you may set in our PSConv operator.

Originally posted by @d-li14 in https://github.com/d-li14/PSConv/issues/3#issuecomment-659709139
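For concreteness, a minimal sketch of that drop-in replacement, assuming `PSConv2d` from `psconv.py` accepts `nn.Conv2d`-style arguments plus a `parts` keyword (check `psconv.py` for the exact signature):

```python
import torch.nn as nn
from psconv import PSConv2d  # psconv.py from this repository

# before: a plain convolution inside the backbone
conv = nn.Conv2d(64, 64, kernel_size=3, stride=1, padding=1, bias=False)

# after: the PSConv counterpart; `parts` is the extra hyperparameter
# mentioned in the quoted comment (value chosen for illustration)
conv = PSConv2d(64, 64, kernel_size=3, stride=1, padding=1, parts=4, bias=False)
```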

Hi, I'm reading your paper and have run into some problems; I hope you can help me figure them out! The original layer is

```python
Conv2d(3, 64, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3), bias=False)
```

and the PSConv replacement prints as

```python
PSGConv2d(
  (gwconv): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), groups=4, bias=False)
  (gwconv_shift): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(2, 2), dilation=(2, 2), groups=4, bias=False)
  (conv): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
)
```

To get past the error `'PSGConv2d' object has no attribute 'weight'` while building the ResNet, I added

```python
self.weight = self.conv.weight
self.bias = self.conv.bias
```

I don't know whether that is right or wrong, but it works for now. After that, during training, inside `PSGConv2d.forward` I get

```python
self.gwconv(x).shape  # torch.Size([6, 256, 202, 274])
self.conv(x).shape    # torch.Size([6, 256, 202, 274])
x_shift.shape         # torch.Size([6, 256, 204, 276])
```

So the branches cannot be added together: `The size of tensor a (274) must match the size of tensor b (276) at non-singleton dimension 3`.
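For reference, the mismatch follows from the standard `Conv2d` output-size formula, `H_out = (H_in + 2*padding - dilation*(kernel_size - 1) - 1) // stride + 1`: with `kernel_size=3` and `padding=2`, the output keeps the input size only when `dilation=2`; with the default `dilation=1` it grows by 2, which is exactly the reported 204×276 vs. 202×274. A minimal sketch reproducing the shapes (layer sizes are illustrative):

```python
import torch
import torch.nn as nn

x = torch.randn(1, 64, 202, 274)

# padding=2 together with dilation=2 preserves the spatial size
ok = nn.Conv2d(64, 64, kernel_size=3, padding=2, dilation=2, groups=4, bias=False)
print(ok(x).shape)   # torch.Size([1, 64, 202, 274])

# padding=2 with the default dilation=1 grows each spatial dim by 2,
# reproducing the 204x276 vs. 202x274 mismatch reported above
bad = nn.Conv2d(64, 64, kernel_size=3, padding=2, groups=4, bias=False)
print(bad(x).shape)  # torch.Size([1, 64, 204, 276])
```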

I'm using the original FCOS code from tianzhi0549 (thanks for the contribution), and added `psconv.py`, `conv_module.py`, `conv_ws.py`, and `norm.py` from this repository.

Thank you for your time!

quantumsquirrel · Oct 20 '20 12:10

I think there are a few mistakes in the code; try changing 'dilation' to 'padding'! [image: screenshot of the suggested code change]
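The screenshot itself is not preserved here. A plausible reading of the suggestion, assuming the shift branch's convolution was built with only one of the two keywords set, is to keep `padding` and `dilation` at 2 together so the branch preserves spatial size (a sketch, not the repository's exact code):

```python
import torch.nn as nn

# Sketch of the shift branch from the PSGConv2d printout above (names and
# channel counts are illustrative). With kernel_size=3, the output matches
# the input spatially only when padding == dilation:
gwconv_shift = nn.Conv2d(64, 64, kernel_size=3, stride=1,
                         padding=2, dilation=2, groups=4, bias=False)
```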

zhangyuxuan1996 · Mar 04 '21 08:03

> I think there are a few mistakes in the code; try changing 'dilation' to 'padding'! [image: screenshot of the suggested code change]

Hello, how should the core code in this image be understood? Would you mind explaining it? [image]

zyxjtu · Jan 10 '22 11:01