
One complex parameter should count as two params

Open scaomath opened this issue 3 years ago • 2 comments

All models' parameter counting traces back to this code:
https://github.com/TylerYep/torchinfo/blob/8b3ae72c7cac677176f37450ee27b8c860f803cd/torchinfo/layer_info.py#L154-L170

There is currently no check on whether the parameter tensor is complex or real. If a parameter is complex, such as a + ib, it actually represents two real parameters (for the purpose of counting MACs/FLOPs). A sketch of the idea follows.
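As a rough illustration only (a hypothetical helper, not the actual code in layer_info.py):

```python
import torch

def count_real_params(param: torch.Tensor) -> int:
    # Hypothetical helper: each complex entry stores a real and an
    # imaginary part, so it contributes two real-valued parameters.
    num = param.numel()
    return num * 2 if param.is_complex() else num

# A 3x3 complex weight has 9 complex entries, i.e. 18 real parameters.
w = torch.nn.Parameter(torch.randn(3, 3, dtype=torch.cfloat))
print(count_real_params(w))  # 18
```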

Of course, this PR might not conform to torchinfo's development conventions; feel free to close it. I hope complex parameters will be considered in the next version.

scaomath avatar Nov 09 '22 21:11 scaomath

Thanks for looking into this. Could you provide a small example model + output to illustrate your point? That will help validate that this fix works.

TylerYep avatar Nov 09 '22 21:11 TylerYep

> Thanks for looking into this. Could you provide a small example model + output to illustrate your point? That will help validate that this fix works.

I added a test; a complex layer now correctly returns double the parameter count. I encountered this issue when running torchinfo on networks like FNO: zongyi-li/fourier_neural_operator#52 (a toy sketch along those lines is included below). I thought it would be nice to integrate this into one of my all-time favorite add-ons for torch.
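For illustration, a minimal toy model with a complex parameter, loosely in the spirit of FNO's spectral layers (hypothetical example, not the test actually added in this PR):

```python
import torch
from torch import nn
from torchinfo import summary

class ToySpectralLayer(nn.Module):
    # Toy layer holding a complex weight (hypothetical, not the real FNO code).
    def __init__(self, modes: int = 4) -> None:
        super().__init__()
        # 4x4 complex weight: 16 complex entries = 32 real parameters.
        self.weight = nn.Parameter(torch.randn(modes, modes, dtype=torch.cfloat))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x_ft = torch.fft.fft(x)
        out = x_ft[..., : self.weight.shape[0]] @ self.weight
        return torch.fft.ifft(out).real

# Without a complex-aware count torchinfo reports 16 params; with it, 32.
summary(ToySpectralLayer())
```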

Thoughts? @TylerYep

scaomath avatar Nov 09 '22 22:11 scaomath