faster_rcnn_pytorch
Out of memory if the pretrained VGG16 parameters are not fixed
If the pretrained VGG16's parameters are not fixed (frozen), memory usage keeps increasing until the process runs out of memory.
But if these parameters are fixed, training works fine.
Why?
Code in train.py:

for param in net.rpn.features.parameters():
    param.requires_grad = False
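For reference, a minimal, self-contained sketch of the same freezing pattern, using torchvision's VGG16 as a stand-in for net.rpn.features (the optimizer setup here is only an illustration, not the repo's actual train.py):

import torch.optim as optim
import torchvision.models as models

# Stand-in model: torchvision's VGG16. In train.py the pretrained convolutional
# part corresponds to net.rpn.features (weights would be loaded as pretrained).
net = models.vgg16()

# Freeze the pretrained feature extractor so autograd does not compute or keep
# gradient buffers for those weights.
for param in net.features.parameters():
    param.requires_grad = False

# Pass only the parameters that still require gradients to the optimizer, so it
# does not allocate per-parameter state (e.g. momentum buffers) for frozen layers.
trainable_params = [p for p in net.parameters() if p.requires_grad]
optimizer = optim.SGD(trainable_params, lr=1e-3, momentum=0.9)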
Thanks.