
Why is BN frozen twice here?

Open rainylt opened this issue 4 years ago • 2 comments

In FCOS.Pytorch/model/fcos.py:

        def freeze_bn(module):
            # Put BatchNorm2d layers into eval mode so the running
            # mean/var statistics are no longer updated.
            if isinstance(module, nn.BatchNorm2d):
                module.eval()
            # Additionally freeze the affine parameters (gamma/beta) of any
            # BatchNorm* module so the optimizer cannot update them.
            classname = module.__class__.__name__
            if classname.find('BatchNorm') != -1:
                for p in module.parameters():
                    p.requires_grad = False
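
For context, a helper like this is normally passed to nn.Module.apply, which calls it on every submodule recursively. A minimal sketch of that usage, assuming freeze_bn from above is in scope (the torchvision ResNet is just a stand-in for the actual FCOS model):

    import torchvision.models as models

    model = models.resnet18()
    model.apply(freeze_bn)  # apply() recurses into every submodule

    # every BatchNorm2d is now in eval mode with frozen gamma/beta
    print(model.bn1.training)              # False
    print(model.bn1.weight.requires_grad)  # False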

Since module.eval() already freezes the BN layers, why do you additionally set p.requires_grad=False? Is there another module whose name matches BatchNorm*?

rainylt avatar Oct 12 '20 14:10 rainylt

eval() is not equivalent to requires_grad.

VectXmy avatar Oct 12 '20 14:10 VectXmy
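
To make the distinction concrete, here is a small standalone sketch (not from the repo) showing that the two switches control different things on a BatchNorm2d layer:

    import torch
    import torch.nn as nn

    bn = nn.BatchNorm2d(4)
    x = torch.randn(8, 4, 16, 16)

    # eval() only switches BN to its stored running statistics and stops
    # updating them; the affine parameters still take part in autograd.
    bn.eval()
    bn(x).sum().backward()
    print(bn.weight.grad is not None)  # True: an optimizer step would still move gamma/beta

    # requires_grad=False is what actually excludes gamma/beta from training.
    bn.weight.grad = None
    for p in bn.parameters():
        p.requires_grad = False
    x = torch.randn(8, 4, 16, 16, requires_grad=True)
    bn(x).sum().backward()
    print(bn.weight.grad)  # None: the parameters are now outside the graph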

> eval() is not equivalent to requires_grad.

Thank you for your quick reply. I have learned that turning off requires_grad can speed up a module, but at inference time we usually wrap the forward pass in with torch.no_grad() to achieve the same thing, as in line 153 of eval.py. So does that mean turning off requires_grad in freeze_bn matters for some situation during training, or is it just to make the code robust?

I'm learning PyTorch with this code, so some of my questions may be stupid; please forgive me.

rainylt avatar Oct 13 '20 03:10 rainylt
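
For reference, torch.no_grad() and requires_grad=False are not interchangeable during training. A sketch of why, with a trainable conv standing in for the backbone layers that sit in front of a frozen BN:

    import torch
    import torch.nn as nn

    conv = nn.Conv2d(4, 4, 3, padding=1)   # trainable, like the backbone convs
    bn = nn.BatchNorm2d(4)
    bn.eval()
    for p in bn.parameters():
        p.requires_grad = False            # freeze only gamma/beta

    x = torch.randn(2, 4, 8, 8)

    # With requires_grad=False, gradients still flow *through* the frozen BN,
    # so the conv in front of it keeps training:
    bn(conv(x)).sum().backward()
    print(conv.weight.grad is not None)    # True

    # Wrapping the BN forward in no_grad would instead cut the graph, and the
    # conv would receive no gradient at all:
    conv.weight.grad = None
    with torch.no_grad():
        z = bn(conv(x))
    print(z.requires_grad)                 # False: z.sum().backward() would fail

So no_grad is the right tool at pure inference time (as in eval.py), while requires_grad=False is what keeps the frozen parameters fixed without blocking backpropagation during training.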