flops-counter.pytorch

FLOPs counter for convolutional networks in the PyTorch framework

38 flops-counter.pytorch issues, sorted by recently updated

Hi, Thanks for open sourcing a useful tool! I have been using this for double-checking # of MACs and it worked well for alexnet/vgg/resnet/mobilenet/efficientnet and even convnext. Now I would...

The number of parameters of each module is calculated by the following code: https://github.com/sovrasov/flops-counter.pytorch/blob/5f2a45f8ff117ce5ad34a466270f4774edd73379/ptflops/pytorch_engine.py#L110-L112 I ran this code on torch.nn.BatchNorm2d like this: `import torch`, `bn = torch.nn.BatchNorm2d(10)`, `sum(p.numel() for p in...

question
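For context on the count above: `module.parameters()` yields only learnable parameters, while BatchNorm2d also carries running-statistics buffers that it does not yield. A minimal pure-Python sketch of the expected counts (an illustration of the standard affine BatchNorm2d layout, not the library's own code):

```python
def batchnorm2d_param_count(num_features, affine=True, count_buffers=False):
    """Expected result of sum(p.numel() for p in bn.parameters())
    for torch.nn.BatchNorm2d(num_features)."""
    # Learnable parameters: weight (gamma) and bias (beta), num_features each.
    params = 2 * num_features if affine else 0
    # running_mean and running_var are registered as buffers, not
    # nn.Parameters, so .parameters() does not include them.
    buffers = 2 * num_features
    return params + (buffers if count_buffers else 0)

print(batchnorm2d_param_count(10))                      # 20: weight + bias only
print(batchnorm2d_param_count(10, count_buffers=True))  # 40: plus running stats
```

So for `BatchNorm2d(10)` the sum over `parameters()` comes out to 20, even though the module stores 40 values in total.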

I get 4.12B FLOPs with your code, whereas almost all research papers report 4.09B FLOPs for this configuration (the PyTorch default pretrained model with 76.15% test accuracy). Can you please modify the...

question

Why use GMACs? GMACs are different from GFLOPs.

question
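The usual convention (an assumption here, since papers differ on whether a fused multiply-add counts as one or two operations) is that one MAC bundles one multiplication and one addition, so FLOPs ≈ 2 × MACs. A minimal sketch of the conversion:

```python
def macs_to_gflops(gmacs, flops_per_mac=2):
    # One multiply-accumulate = one multiply + one add, so the common
    # convention is FLOPs = 2 * MACs. Some papers instead count a fused
    # multiply-add as a single operation, i.e. flops_per_mac=1.
    return gmacs * flops_per_mac

print(macs_to_gflops(4.09))  # a 4.09 GMac model is ~8.18 GFLOPs
```

Under this convention a GMac figure and a GFLOPs figure for the same model differ by exactly a factor of two.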

One part of my model performs matrix multiplication, i.e. torch.bmm() / torch.matmul(), but the results show the GMac of this part is 0.0. I want to know if this is...

enhancement
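ptflops counts work by hooking `nn.Module` instances, so a bare functional call like `torch.bmm()` inside `forward()` is invisible to it, which would explain the 0.0 GMac. The MACs such a call actually performs are easy to state analytically; a hypothetical helper (not part of ptflops):

```python
def bmm_macs(batch, n, m, p):
    # torch.bmm on shapes (B, n, m) x (B, m, p) -> (B, n, p)
    # performs B * n * m * p multiply-accumulates.
    return batch * n * m * p

def matmul_macs(n, m, p):
    # torch.matmul on (n, m) x (m, p) -> (n, p) performs n * m * p MACs.
    return n * m * p

print(bmm_macs(8, 64, 128, 64))  # 4194304 MACs, i.e. ~0.004 GMac
```

A figure like this could be added by hand to the reported total, or the matmul could be wrapped in a small `nn.Module` so a custom hook can see it.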

The GMACs are the same regardless of any frozen layers.
```
All layers are trainable:
Computational complexity: 7.63 GMac
Number of parameters: 128.92 M
Only classifier is trainable:
Computational complexity:...
```

question
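This behavior is expected: `requires_grad` only controls which parameters receive gradients in the backward pass, while the forward pass still executes every layer, so forward MACs are unchanged. A sketch with made-up per-layer numbers (the layer names and figures below are illustrative, not taken from any real model):

```python
# Hypothetical per-layer stats: parameter count and forward MACs.
layers = {
    "features":   {"params": 124_000_000, "macs": 7_500_000_000},
    "classifier": {"params": 4_920_000,   "macs": 130_000_000},
}

def forward_macs(layers, frozen=frozenset()):
    # Frozen or not, every layer still runs in the forward pass,
    # so the frozen set has no effect on this sum.
    return sum(l["macs"] for l in layers.values())

def trainable_params(layers, frozen=frozenset()):
    # Freezing does change how many parameters are trainable.
    return sum(l["params"] for name, l in layers.items() if name not in frozen)

assert forward_macs(layers) == forward_macs(layers, frozen={"features"})
print(trainable_params(layers, frozen={"features"}))  # 4920000
```

A counter that only traces the forward pass therefore cannot distinguish a fully trainable model from a mostly frozen one; the parameter count is the figure that moves.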

In the last table I only see MACs and params; are FLOPs not meant to be recorded? Some papers still report FLOPs (G) for comparison.

question

I have run a model with a pixelshuffle operation through this FLOPs calculation. However, it cannot calculate the FLOPs of the PixelShuffle layer: PixelShuffle(0.0 M, 0.000% Params, 0.0 GMac, 0.000% MACs, upscale_factor=2)...

question
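A 0-MAC report for PixelShuffle is arguably correct: the op only rearranges elements from the channel dimension into the spatial dimensions and performs no multiplies or adds. A pure-Python sketch of the shape transform, assuming the documented (C·r², H, W) → (C, H·r, W·r) behavior:

```python
def pixel_shuffle_shape(channels, height, width, upscale_factor):
    # torch.nn.PixelShuffle(r) maps (C*r*r, H, W) -> (C, H*r, W*r);
    # it only moves data, with no arithmetic, so 0 MACs is expected.
    r = upscale_factor
    if channels % (r * r) != 0:
        raise ValueError("channels must be divisible by upscale_factor**2")
    return (channels // (r * r), height * r, width * r)

print(pixel_shuffle_shape(64, 32, 32, 2))  # (16, 64, 64)
```

The element count is preserved (64·32·32 = 16·64·64), which is consistent with a pure reshuffle rather than a compute layer.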