faster-rcnn.pytorch

Why should we fix some specific layers in ResNet?

Open ZongxianLee opened this issue 6 years ago • 2 comments

I am confused about line 250 in lib/model/faster_rcnn/resnet.py. Why should we fix some layers in ResNet?

```python
# Fix blocks
for p in self.RCNN_base[0].parameters(): p.requires_grad = False
for p in self.RCNN_base[1].parameters(): p.requires_grad = False
```

Shouldn’t these layers be trained jointly with the rest of the network?

ZongxianLee avatar Aug 26 '19 02:08 ZongxianLee

I have the same question. Did you find an answer to it? If so, could you please share it?

jiteshm17 avatar Mar 24 '20 12:03 jiteshm17

Because we are using pretrained weights (resnet101_caffe.pth), we are in essence doing transfer learning: all layers have already been trained on ImageNet, and now we want to fine-tune the model on COCO, VOC, or a custom dataset. So we fix (freeze) the earlier layers, which capture generic, non-dataset-specific features, and leave the later layers unfrozen so they can adapt to the new dataset. This also reduces training time.
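A minimal sketch of this freezing pattern in PyTorch. The model here is a stand-in, not the repo's actual `RCNN_base` (which wraps a pretrained ResNet-101); it only illustrates how setting `requires_grad = False` on the earliest blocks keeps their generic features fixed while the rest trains:

```python
import torch
import torch.nn as nn

# Illustrative stand-in for the backbone: in the real code, base[0] would
# correspond to conv1 and base[1] to the first residual stage (layer1).
base = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3),  # stands in for conv1
    nn.Conv2d(64, 64, kernel_size=3, padding=1),           # stands in for layer1
    nn.Conv2d(64, 128, kernel_size=3, padding=1),          # later, trainable layer
)

# Freeze the earliest blocks: their ImageNet-learned weights receive no
# gradient updates during fine-tuning.
for p in base[0].parameters():
    p.requires_grad = False
for p in base[1].parameters():
    p.requires_grad = False

# Hand only the unfrozen parameters to the optimizer.
optimizer = torch.optim.SGD(
    (p for p in base.parameters() if p.requires_grad), lr=1e-3
)

frozen = sum(p.numel() for p in base.parameters() if not p.requires_grad)
print("frozen parameters:", frozen)
```

Because the frozen parameters are also excluded from the optimizer, the backward pass skips them entirely, which is where the training-time saving comes from.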

TonojiKiobya avatar Dec 17 '21 06:12 TonojiKiobya