flops-counter.pytorch
FLOPs counter for convolutional networks in the PyTorch framework
Hi, thanks for open-sourcing a useful tool! I have been using it to double-check the number of MACs, and it worked well for AlexNet/VGG/ResNet/MobileNet/EfficientNet and even ConvNeXt. Now I would...
The number of parameters of each module is calculated by the following code: https://github.com/sovrasov/flops-counter.pytorch/blob/5f2a45f8ff117ce5ad34a466270f4774edd73379/ptflops/pytorch_engine.py#L110-L112 I used this code on torch.nn.BatchNorm2d like this: `import torch`, `bn = torch.nn.BatchNorm2d(10)`, `sum(p.numel() for p in...`
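For reference, a minimal sketch (separate from the issue's truncated snippet) of how `.parameters()` behaves on `BatchNorm2d`: it returns only the learnable affine weight and bias, while the running statistics are buffers and are not included in the count.

```python
import torch

# BatchNorm2d(10) has two learnable tensors: weight and bias, 10 elements each.
bn = torch.nn.BatchNorm2d(10)
learnable = sum(p.numel() for p in bn.parameters())
print(learnable)  # 20

# running_mean, running_var and num_batches_tracked are buffers, not parameters,
# so they do not appear in the parameter count above.
buffers = sum(b.numel() for b in bn.buffers())
print(buffers)  # 21
```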
I am getting 4.12B FLOPs using your code, whereas almost all research papers report 4.09B FLOPs for this configuration (the default PyTorch pretrained model with 76.15% test accuracy). Can you please modify the...
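A sketch of how such a figure can be reproduced, assuming the issue refers to torchvision's ResNet-50 and ptflops' `get_model_complexity_info` entry point; small gaps versus paper numbers usually come from which layers (batch norm, pooling, activations) a given counter includes.

```python
import torch
from torchvision.models import resnet50
from ptflops import get_model_complexity_info

model = resnet50()  # weights are irrelevant for complexity counting

macs, params = get_model_complexity_info(
    model, (3, 224, 224),
    as_strings=False,            # return raw numbers instead of formatted strings
    print_per_layer_stat=False,  # skip the per-layer breakdown
)
print(f"MACs: {macs / 1e9:.2f} G, params: {params / 1e6:.2f} M")
```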
Why use GMACs? GMACs are different from GFLOPs.
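The usual convention (and the assumption made here) is that one multiply-accumulate counts as two floating-point operations, so a reported GMac value can be converted to GFLOPs by doubling it:

```python
# One multiply-accumulate (MAC) = one multiplication + one addition,
# so under the common convention FLOPs ≈ 2 × MACs.
gmacs = 7.63             # illustrative value, e.g. a reported "7.63 GMac"
gflops = 2 * gmacs       # ≈ 15.26 GFLOPs
print(f"{gmacs} GMac ≈ {gflops:.2f} GFLOPs")
```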
One part of my model performs matrix multiplication, i.e. torch.bmm() / torch.matmul(), but the results show the GMac of this part is 0.0. I want to know if this is...
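The counter works by registering hooks on `nn.Module` instances, so functional calls such as `torch.bmm`/`torch.matmul` made inside `forward` are not attached to any hooked module and contribute 0 GMac. A hedged sketch of accounting for them by hand, assuming a batched multiplication of shapes (B, N, M) x (B, M, P):

```python
import torch

B, N, M, P = 8, 64, 128, 256
a = torch.randn(B, N, M)
b = torch.randn(B, M, P)
out = torch.bmm(a, b)  # shape (B, N, P)

# Each output element requires M multiply-accumulates,
# so the total MAC count of this bmm is B * N * P * M.
macs = B * N * P * M
print(f"bmm MACs: {macs / 1e6:.2f} M")  # add this to the tool's total manually
```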
The GMACs are the same regardless of any frozen layers.
```
All layers are trainable:
Computational complexity: 7.63 GMac
Number of parameters: 128.92 M

Only classifier is trainable:
Computational complexity: ...
```
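This is expected: MACs measure the forward-pass computation, which is identical whether or not gradients are required. Freezing only changes how many parameters are trainable, which can be counted separately. A minimal sketch, using VGG-16 purely as an illustrative model (the issue's actual model is not specified):

```python
import torch
import torchvision.models as models

model = models.vgg16()

# Freeze the convolutional backbone; forward-pass MACs stay the same,
# because requires_grad only affects backprop, not the forward computation.
for p in model.features.parameters():
    p.requires_grad_(False)

total = sum(p.numel() for p in model.parameters())
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"total params: {total / 1e6:.2f} M, trainable: {trainable / 1e6:.2f} M")
```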
From the last table I only saw MACs and params; does that mean FLOPs don't need to be recorded? Some papers still report FLOPs (in GFLOPs) for comparison.
I passed a model with a PixelShuffle operation to this FLOPs calculation; however, it cannot calculate the FLOPs of the PixelShuffle layer. PixelShuffle(0.0 M, 0.000% Params, 0.0 GMac, 0.000% MACs, upscale_factor=2)...
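PixelShuffle only rearranges tensor elements and performs no multiply-accumulates, so 0 GMac is arguably accurate. If a non-zero figure is still wanted (e.g. one operation per moved element), a custom hook can be supplied; the sketch below assumes ptflops' `custom_modules_hooks` argument and its hook convention of adding to `module.__flops__`.

```python
import torch.nn as nn
from ptflops import get_model_complexity_info

def pixel_shuffle_hook(module, input, output):
    # Count one op per output element; this is an assumption, not an official hook.
    module.__flops__ += int(output.numel())

model = nn.Sequential(nn.Conv2d(3, 12, 3, padding=1), nn.PixelShuffle(2))

macs, params = get_model_complexity_info(
    model, (3, 64, 64),
    as_strings=False, print_per_layer_stat=False,
    custom_modules_hooks={nn.PixelShuffle: pixel_shuffle_hook},
)
print(f"MACs: {macs / 1e6:.2f} M, params: {params}")
```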