flops-counter.pytorch
Count additions and multiplications separately
Hi,
This is a great repo! :-) But can you add functionality to compute the addition count and multiplication count separately?
Thanks!
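For illustration, here is a minimal sketch of what separate add/mult counts could look like for a single Conv2d layer, using a plain forward hook; this does not use flops-counter.pytorch's actual API, and the attribute names (`__mults__`, `__adds__`) and hook function are only hypothetical:

```python
import torch
import torch.nn as nn

def count_conv2d_ops(module, inputs, output):
    # Number of output elements: batch * out_channels * out_h * out_w
    out_elements = output.numel()
    # Multiplications per output element: kernel_h * kernel_w * in_channels / groups
    kernel_ops = (module.kernel_size[0] * module.kernel_size[1]
                  * module.in_channels // module.groups)
    module.__mults__ += out_elements * kernel_ops
    # Each output element needs (kernel_ops - 1) additions to sum the products,
    # plus one more addition if the layer has a bias term.
    adds_per_element = kernel_ops - 1 + (1 if module.bias is not None else 0)
    module.__adds__ += out_elements * adds_per_element

model = nn.Conv2d(3, 16, kernel_size=3, padding=1, bias=True)
model.__mults__ = 0
model.__adds__ = 0
model.register_forward_hook(count_conv2d_ops)

with torch.no_grad():
    model(torch.randn(1, 3, 32, 32))

print(f"mults: {model.__mults__}, adds: {model.__adds__}")
```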
I think this makes little sense, since modern compute accelerators have FMA instructions: a multiply and an add execute as a single fused operation.
Multiply operations are generally an order of magnitude more expensive than additions, so this could be very useful for people experimenting with hardware that lacks FMA instructions.
Do you mean that mul (or conv2d) layers consume drastically more FLOPs than bias-add/add? When I ran some FLOPs calculations in TensorFlow, the reports usually attributed over 90% of the FLOPs to mul (or conv2d) and less than 5% to bias-add/add.
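That breakdown follows from simple counting: a convolution does one multiply-accumulate per kernel element per output element, while the bias contributes only one extra add per output element. A rough back-of-the-envelope check, using an assumed example layer (not a figure from this thread):

```python
# Hypothetical conv layer: 64 -> 128 channels, 3x3 kernel, 56x56 output
in_ch, out_ch, k, h, w = 64, 128, 3, 56, 56

conv_macs = out_ch * h * w * (in_ch * k * k)  # one MAC per kernel element per output
bias_adds = out_ch * h * w                    # one extra add per output element

print(f"conv MACs:  {conv_macs:,}")           # ~231 million
print(f"bias adds:  {bias_adds:,}")           # ~0.4 million
print(f"bias share: {bias_adds / (conv_macs + bias_adds):.4%}")  # well under 1%
```

So even before weighing multiplies as more expensive than adds, the bias-add term is a tiny fraction of the total, which matches the TensorFlow report you describe.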