flops-counter.pytorch
How to calculate the FLOPs for the PixelShuffle operation?
I ran a model containing a PixelShuffle operation through this FLOPs counter. However, it cannot calculate the FLOPs for the PixelShuffle layer:
PixelShuffle(0.0 M, 0.000% Params, 0.0 GMac, 0.000% MACs, upscale_factor=2)
Can anyone help?
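For reference, here is a minimal sketch that reproduces the per-layer report above with a toy sub-pixel upsampling model, assuming the library's `get_model_complexity_info` entry point:

```python
import torch.nn as nn
from ptflops import get_model_complexity_info

# Toy sub-pixel upsampling head: a conv producing r^2 * C channels,
# followed by PixelShuffle with upscale_factor r = 2.
model = nn.Sequential(
    nn.Conv2d(3, 3 * 2 ** 2, kernel_size=3, padding=1),
    nn.PixelShuffle(upscale_factor=2),
)

macs, params = get_model_complexity_info(
    model, (3, 64, 64), as_strings=True, print_per_layer_stat=True
)
print(f"MACs: {macs}, params: {params}")
# The PixelShuffle line reports 0.0 GMac, as in the output above.
```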
Hi sovrasov, may I ask whether PixelShuffle consumes any floating point operations? I cannot find any equation or information about the FLOPs of PixelShuffle, so I wonder whether this layer performs floating point operations at all. Thanks.
I've reviewed the original paper that introduced PixelShuffle and agree that it's rather a memory rearrangement operator, so the current behavior is correct.
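To illustrate why: PixelShuffle can be expressed purely as a view, permute, and reshape, with zero multiply-accumulate operations. A minimal sketch (the function name is illustrative):

```python
import torch
import torch.nn.functional as F

def pixel_shuffle_by_rearrangement(x: torch.Tensor, r: int) -> torch.Tensor:
    """Reproduce pixel_shuffle using only view/permute/reshape,
    i.e. pure memory rearrangement with no arithmetic."""
    n, c_r2, h, w = x.shape
    c = c_r2 // (r * r)
    x = x.view(n, c, r, r, h, w)
    x = x.permute(0, 1, 4, 2, 5, 3)   # (n, c, h, r, w, r)
    return x.reshape(n, c, h * r, w * r)

x = torch.randn(1, 12, 8, 8)
assert torch.equal(pixel_shuffle_by_rearrangement(x, 2), F.pixel_shuffle(x, 2))
```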
Thanks for the clarification. Does that mean a memory rearrangement operator is not counted as floating point operations? Does a memory rearrangement operator still consume computational power and time?
Yep, memory rearrangement always consumes time, but it is out of scope for the FLOPs metric. I'd rather keep the straightforward FLOPs definition to avoid misunderstanding. For operations like PixelShuffle, or even ReLU, we should think about a different metric.
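As a rough illustration of what such a metric could look like, one could count elements read and written per layer with plain PyTorch forward hooks. This is only a sketch of a memory-traffic proxy, not part of this library:

```python
import torch
import torch.nn as nn

def count_memory_traffic(model: nn.Module, x: torch.Tensor) -> dict:
    """Rough proxy metric: elements read + written per leaf module.
    Sketch of an alternative to MACs for memory-bound layers
    such as PixelShuffle or ReLU."""
    stats, handles = {}, []

    def make_hook(name):
        def hook(module, inputs, output):
            read = sum(i.numel() for i in inputs if torch.is_tensor(i))
            written = output.numel() if torch.is_tensor(output) else 0
            stats[name] = read + written
        return hook

    for name, module in model.named_modules():
        if not list(module.children()):  # leaf modules only
            handles.append(module.register_forward_hook(make_hook(name)))
    with torch.no_grad():
        model(x)
    for h in handles:
        h.remove()
    return stats

model = nn.Sequential(nn.ReLU(), nn.PixelShuffle(2))
print(count_memory_traffic(model, torch.randn(1, 12, 8, 8)))
```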
Thanks. You mentioned a paper that introduced PixelShuffle; could you share its title?
https://arxiv.org/abs/1609.05158
Thanks.