FCOS.Pytorch
Why is BN frozen twice here?
In FCOS.Pytorch/model/fcos.py:

```python
def freeze_bn(module):
    if isinstance(module, nn.BatchNorm2d):
        module.eval()
    classname = module.__class__.__name__
    if classname.find('BatchNorm') != -1:
        for p in module.parameters():
            p.requires_grad = False
```
Since module.eval() already freezes the BN, why do you additionally set p.requires_grad=False? Is there another module whose class name contains 'BatchNorm'?
eval() is not equivalent to requires_grad.
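To spell the distinction out: eval() only switches BN to its running statistics (and stops updating them), while requires_grad=False is what stops the affine parameters from being updated. A minimal standalone sketch (my own toy example, not code from this repo):

```python
import torch
import torch.nn as nn

bn = nn.BatchNorm2d(8)
bn.eval()  # eval() only switches BN to its running statistics

x = torch.randn(4, 8, 16, 16)
bn(x).sum().backward()

# Even in eval mode, the affine parameters still receive gradients,
# so an optimizer step would still change them:
print(bn.weight.grad is not None)  # True
print(bn.weight.requires_grad)     # True

# Only this actually freezes the parameters:
for p in bn.parameters():
    p.requires_grad = False
```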
Thank you for your quick reply. I have learned that turning off requires_grad can speed this module up, but at inference time we always wrap the forward pass in torch.no_grad() to the same effect, as on line 153 of eval.py. So does turning off requires_grad in freeze_bn matter for some situations in the training process, or is it just to make the code more robust?
I'm learning PyTorch with this code, so some questions may be stupid; please forgive me.
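To make my question concrete, here is a small sketch of the training situation I have in mind (again a hypothetical standalone example, not code from this repo): during a training step there is no torch.no_grad(), so it seems requires_grad=False is the thing keeping the BN parameters fixed.

```python
import torch
import torch.nn as nn

# Toy model: a conv followed by a BN layer we want frozen.
model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8))
bn = model[1]

bn.eval()                    # stop updating running_mean / running_var
for p in bn.parameters():
    p.requires_grad = False  # stop the optimizer from touching weight/bias

# The optimizer only sees the parameters that are still trainable.
opt = torch.optim.SGD([p for p in model.parameters() if p.requires_grad], lr=0.1)

before = bn.weight.clone()
loss = model(torch.randn(2, 3, 16, 16)).sum()  # ordinary training forward
loss.backward()                                # no grads reach bn.weight
opt.step()

print(torch.equal(before, bn.weight))  # True: the BN parameters stayed frozen
```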