
What is the effect of frozen layers on GMACs?

Open ahmadmobeen opened this issue 4 years ago • 1 comment

The reported GMACs are the same regardless of which layers are frozen.

All layers are trainable:
Computational complexity:       7.63 GMac
Number of parameters:           128.92 M

Only classifier is trainable:
Computational complexity:       7.63 GMac
Number of parameters:           155.69 k
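
For reference, numbers like these come from ptflops' `get_model_complexity_info`. Here is a minimal sketch of such a measurement; the `Net` model, its `classifier` head, and the 224×224 input shape are hypothetical stand-ins, not the actual code behind the figures above:

```python
import torch.nn as nn
from ptflops import get_model_complexity_info

# Hypothetical stand-in for the network in question; only the structure
# (a backbone plus a 'classifier' head) matters for this illustration.
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.classifier = nn.Linear(16, 10)

    def forward(self, x):
        return self.classifier(self.backbone(x))

model = Net()

# Freeze everything except the classifier head.
for name, param in model.named_parameters():
    if not name.startswith('classifier'):
        param.requires_grad = False

# ptflops reports forward-pass MACs and the trainable parameter count.
macs, params = get_model_complexity_info(
    model, (3, 224, 224), as_strings=True, print_per_layer_stat=False)
print('Computational complexity:', macs)
print('Number of parameters:', params)
```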

In my understanding, if `param.requires_grad` is set to `False` for some of the layers, those layers would not be computed; however, they would remain part of the graph.

So shouldn't such layers be excluded from the GMAC calculation, since they would not be computed during training, which would reduce the number of operations?

Please correct me if my understanding is wrong.

ahmadmobeen avatar Jul 26 '21 05:07 ahmadmobeen

`requires_grad` prevents PyTorch from computing gradients for particular parameters during training. It doesn't affect forward-pass complexity, which is what ptflops measures: a frozen layer still executes in the forward pass, so all of its multiply-accumulate operations are still performed. See the PyTorch docs for details.

sovrasov avatar Jul 26 '21 13:07 sovrasov
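
A minimal sketch of what `requires_grad` actually controls, using standard PyTorch behavior (the toy `nn.Linear` layer is an illustrative assumption, not code from this thread):

```python
import torch
import torch.nn as nn

layer = nn.Linear(4, 4)
layer.weight.requires_grad = False  # "freeze" the weight

x = torch.randn(1, 4, requires_grad=True)
y = layer(x).sum()  # the frozen layer still runs in the forward pass
y.backward()

print(layer.weight.grad)   # None: no gradient was computed for the frozen weight
print(x.grad is not None)  # True: gradients still flow through the layer
```

Freezing skips the gradient computation for those parameters during backward, but the forward pass, and hence the MAC count reported by ptflops, is unchanged.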