One complex parameter should count as two params
All models' parameter counting traces back to https://github.com/TylerYep/torchinfo/blob/8b3ae72c7cac677176f37450ee27b8c860f803cd/torchinfo/layer_info.py#L154-L170, where there is no check on whether the parameter tensor is complex or real. If a parameter is complex, such as a + i b, it actually represents two real parameters (for the purpose of counting MACs/FLOPs).
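To make the point concrete, here is a minimal sketch (the `SpectralScale` layer and its shapes are hypothetical, loosely inspired by FNO-style spectral weights): a single complex64 tensor with N entries stores 2N real-valued numbers, so a plain `numel()` count under-reports the parameter count by half.

```python
import torch
import torch.nn as nn

class SpectralScale(nn.Module):
    """Toy layer with a complex-valued weight, loosely inspired by FNO spectral convolutions."""
    def __init__(self, modes: int = 4):
        super().__init__()
        # A complex64 tensor with `modes` entries stores 2 * modes real numbers.
        self.weight = nn.Parameter(torch.randn(modes, dtype=torch.cfloat))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Pointwise multiply in the frequency domain, then return to real space.
        return torch.fft.ifft(torch.fft.fft(x) * self.weight).real

model = SpectralScale(modes=4)
naive = sum(p.numel() for p in model.parameters())
real_valued = sum(p.numel() * (2 if p.is_complex() else 1) for p in model.parameters())
print(naive, real_valued)  # 4 8 -- each complex parameter is really two real parameters
```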
Of course, this PR might not conform to torchinfo's development plans, so feel free to close it; I just hope complex parameters will be considered in a future version.
Thanks for looking into this. Could you provide a small example model + output to illustrate your point? That will help validate that this fix works.
I added a test; the complex layer correctly returns the doubled parameter count. I encountered this issue when running torchinfo on a network like FNO: zongyi-li/fourier_neural_operator#52. I thought it would be nice to integrate this into one of my all-time favorite add-ons for torch.
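For reference, a minimal sketch of what such a test could look like (the `ComplexLinear` module and its shapes are made up for illustration; only `torchinfo.summary` and its `total_params` field are assumed, and the assertion only holds once complex dtypes are counted as two real parameters each):

```python
import torch
import torch.nn as nn
from torchinfo import summary

class ComplexLinear(nn.Module):
    """Hypothetical linear layer whose weight matrix is complex-valued."""
    def __init__(self, in_features: int = 8, out_features: int = 8):
        super().__init__()
        # out_features * in_features complex entries == twice as many real parameters.
        self.weight = nn.Parameter(
            torch.randn(out_features, in_features, dtype=torch.cfloat)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.matmul(x.to(torch.cfloat), self.weight.t()).real

def test_complex_parameter_count() -> None:
    stats = summary(ComplexLinear(8, 8), input_size=(1, 8), verbose=0)
    # 8 * 8 complex weights should be reported as 128 real-valued parameters.
    assert stats.total_params == 2 * 8 * 8
```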
Thoughts? @TylerYep