
When using transfer learning with a fixed feature extractor, all parameters are counted as trainable parameters.

brpy opened this issue 4 years ago • 2 comments

The parameters of layers with requires_grad=False are counted as trainable parameters.
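For illustration, a minimal sketch of a setup that would trigger this (the model choice, frozen sub-module, and input size here are assumptions, not the original code):

```python
import torchvision.models as models
from torchsummary import summary

# Load a pretrained VGG16 and freeze the convolutional feature extractor
# so it acts as a fixed feature extractor.
model = models.vgg16(pretrained=True)
for param in model.features.parameters():
    param.requires_grad = False

# The reported "Trainable params" would be expected to exclude the
# frozen layers, but per this issue they are counted as trainable.
summary(model, input_size=(3, 224, 224), device="cpu")
```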

brpy avatar Nov 21 '20 07:11 brpy

Could you explain the situation a bit more? Did you encounter this after creating a new model or after loading from a saved checkpoint? If recreating from a save, my understanding is that the requires_grad flags are re-initialized, since they are stored on the nn.Parameter objects rather than in the state_dict. https://discuss.pytorch.org/t/how-to-save-the-requires-grad-state-of-the-weights/52906
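A minimal sketch demonstrating that point (the file name is hypothetical; this assumes the flag is simply not round-tripped through state_dict):

```python
import torch
import torch.nn as nn

# Freeze a parameter, then save only the state_dict.
model = nn.Linear(4, 2)
model.weight.requires_grad = False
torch.save(model.state_dict(), "model.pt")

# load_state_dict copies tensor data into the existing parameters;
# it does not touch their requires_grad flags.
fresh = nn.Linear(4, 2)
fresh.load_state_dict(torch.load("model.pt"))
print(fresh.weight.requires_grad)  # True -- the frozen flag was not restored
```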

esherman9 avatar Mar 03 '21 14:03 esherman9

Thanks for the reply. It was a VGG16 model in which I set requires_grad=False for most of the initial layers. I didn't load from a checkpoint.

The model trained as intended, but torchsummary counted the non-trainable layers as trainable and reported a huge number of trainable parameters.
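As a cross-check independent of torchsummary, the trainable and total counts can be computed directly (a minimal sketch, assuming `model` is the frozen VGG16 from the snippet above):

```python
# Count parameters by inspecting requires_grad directly.
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"trainable: {trainable:,} / total: {total:,}")
```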

Sorry, it was a while ago that I encountered this, so I can't provide more info.

brpy avatar Mar 03 '21 15:03 brpy