pytorch-summary
In transfer learning with a fixed feature extractor, all parameters are counted as trainable parameters.
The parameters of layers with requires_grad=False are counted as trainable parameters.
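For reference, a minimal sketch of the kind of setup that shows the miscount (freezing model.features and the ImageNet input size are just illustrative choices here):

```python
import torch
from torchvision import models
from torchsummary import summary

model = models.vgg16()

# Freeze the convolutional feature extractor, as in a typical
# fixed-feature-extractor transfer learning setup.
for param in model.features.parameters():
    param.requires_grad = False

# torchsummary still reports the frozen parameters under "Trainable params".
summary(model, input_size=(3, 224, 224), device="cpu")
```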
Could you explain the situation a bit more? Did you encounter this after creating a new model or after loading from a saved checkpoint? If you recreated it from a save, my understanding is that the requires_grad flags are re-initialized, as they live on the nn.Parameter objects rather than in the state_dict. https://discuss.pytorch.org/t/how-to-save-the-requires-grad-state-of-the-weights/52906
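Something like this sketch (the file name is arbitrary) shows that a frozen flag does not survive a state_dict round trip:

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 2)
model.weight.requires_grad = False
torch.save(model.state_dict(), "checkpoint.pth")

# A freshly constructed module has requires_grad=True by default, and
# load_state_dict only copies tensor data, not the requires_grad flags.
restored = nn.Linear(4, 2)
restored.load_state_dict(torch.load("checkpoint.pth"))
print(restored.weight.requires_grad)  # True: the frozen state was not restored
```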
Thanks for the reply. It was a vgg16 model in which I set requires_grad=False for most of the initial layers. I didn't use a checkpoint.
The model worked as intended, but torchsummary counted the non-trainable layers as trainable and reported a huge number of trainable params.
Sorry, it was a while ago that I encountered this, so I cannot provide more info.
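For anyone hitting the same thing, one way to cross-check torchsummary's numbers is to count parameters by their requires_grad flag directly (again using a frozen-features vgg16 as an illustrative example):

```python
from torchvision import models

model = models.vgg16()
for param in model.features.parameters():
    param.requires_grad = False

# Count elements per parameter tensor, split by the requires_grad flag.
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
frozen = sum(p.numel() for p in model.parameters() if not p.requires_grad)
print(f"trainable: {trainable:,}  frozen: {frozen:,}")
```

If these counts disagree with the summary output, the model itself is frozen correctly and only the reporting is off.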