Non-local_pytorch

problem of function W initialization

Open djy-tsinghua opened this issue 4 years ago • 2 comments

I think you want to initialize self.W to zero so that the residual path won't affect the pre-trained model, but I cannot figure out why you initialize self.W[1] rather than self.W[0] when using a BN layer.

djy-tsinghua avatar Jun 26 '20 12:06 djy-tsinghua

Hi @djy-tsinghua, maybe you can find the answer in this issue https://github.com/AlexHex7/Non-local_pytorch/issues/1.
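For context, here is a minimal sketch of the idea discussed in that issue (the channel sizes and variable names below are illustrative, not taken from the repo). When `W` is an `nn.Sequential` of a conv followed by a BatchNorm, the BN is the *last* layer of the residual branch, so zeroing the BN's affine parameters (`W[1].weight` and `W[1].bias`) guarantees the branch outputs exactly zero, and `z = W(y) + x` starts as the identity regardless of what the conv produces:

```python
import torch
import torch.nn as nn

# Hypothetical channel sizes for illustration.
in_channels, inter_channels = 4, 2

# W branch as in a non-local block: 1x1 conv followed by BN.
W = nn.Sequential(
    nn.Conv2d(inter_channels, in_channels, kernel_size=1),
    nn.BatchNorm2d(in_channels),
)

# Zero the affine parameters of the LAST layer (the BN, i.e. W[1]):
# gamma = 0 and beta = 0 make the BN output all zeros,
# no matter what the preceding conv computes.
nn.init.constant_(W[1].weight, 0)  # BN scale (gamma)
nn.init.constant_(W[1].bias, 0)    # BN shift (beta)

y = torch.randn(2, inter_channels, 8, 8)
out = W(y)
print(out.abs().max().item())  # 0.0 -- the branch contributes nothing
```

Zeroing `W[0]` (the conv) instead would not pin the branch to zero by itself: the BN's own parameters (gamma defaults to 1) remain free to move during training, whereas zeroing the final layer's gamma and beta makes the zero output an explicit, learnable starting point, the same "zero-init residual" trick used for the last BN in ResNet blocks.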

AlexHex7 avatar Jun 28 '20 02:06 AlexHex7

Thank you very much, @AlexHex7!

djy-tsinghua avatar Jun 28 '20 03:06 djy-tsinghua