flops-counter.pytorch
FLOPs counter for convolutional networks in the PyTorch framework
Please could you answer my questions: Q1: Can we compute FLOPs for a model without training it? Is there any relation between FLOPs and training? Can training affect FLOPs?...
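On the question above: FLOPs depend only on the architecture and input shape, not on the weight values, so training (which only changes values) cannot affect the count. A minimal sketch illustrating this, using the standard analytic MAC formula for a `Conv2d` layer (shapes only, no weights in the formula):

```python
import torch
import torch.nn as nn

def conv2d_macs(conv: nn.Conv2d, out_h: int, out_w: int) -> int:
    """MACs of one Conv2d call, derived from shapes alone.

    Weight *values* never appear in this formula, which is why
    training cannot change the count.
    """
    kh, kw = conv.kernel_size
    return (conv.in_channels // conv.groups) * kh * kw * conv.out_channels * out_h * out_w

conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)

macs_untrained = conv2d_macs(conv, 32, 32)

# "Train" the layer by overwriting its weights; the count is unchanged.
with torch.no_grad():
    conv.weight.mul_(10.0)
macs_trained = conv2d_macs(conv, 32, 32)

print(macs_untrained, macs_trained)  # identical
```

The same reasoning applies to every layer type ptflops supports: the per-layer hooks read shapes, not values.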
Hey, it's a nice tool. However, I am wondering whether the return value of get_model_complexity_info is correct. Let's assume all calculations are in floating point: 1 MAC = 2 OPs, where MAC = Mult...
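For context on the question above: ptflops reports multiply-accumulate operations (MACs), and under the common convention one MAC counts as two floating-point operations (one multiplication plus one addition). A trivial sketch of the conversion:

```python
def macs_to_flops(macs: float) -> float:
    """One multiply-accumulate = one multiplication + one addition = 2 FLOPs."""
    return 2.0 * macs

# e.g. a model reported at 4.09 GMacs corresponds to 8.18 GFLOPs
print(macs_to_flops(4.09e9))
```

Some papers use "FLOPs" loosely to mean MACs, so factor-of-two discrepancies between tools usually come down to this convention rather than a counting bug.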
```
Collecting ptflops
  Using cached ptflops-0.7.2.1.tar.gz (14 kB)
  Preparing metadata (setup.py) ... done
Requirement already satisfied: torch in d:\app\miniconda3\envs\mmseg\lib\site-packages (from ptflops) (2.1.2)
Requirement already satisfied: filelock in d:\app\miniconda3\envs\mmseg\lib\site-packages (from torch->ptflops) (3.13.1)
...
```
If our model has a deformable convolution layer, can we still use this library?
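Layers the library does not know about are counted as zero, but `get_model_complexity_info` accepts a `custom_modules_hooks` argument mapping a module class to a counting hook. Below is a hedged sketch of such a hook that treats a deformable convolution like a plain convolution (it deliberately ignores the cost of bilinear offset sampling, so it is a lower bound, not an exact count); it accumulates into the `__flops__` attribute that ptflops reads:

```python
import torch
import torch.nn as nn

def deform_conv_flops_hook(module, inputs, output):
    """Count a deformable conv as if it were an ordinary Conv2d.

    Assumption: the module exposes Conv2d-like attributes
    (in_channels, out_channels, kernel_size, groups), as
    torchvision's DeformConv2d does.
    """
    out_h, out_w = output.shape[-2:]
    ks = module.kernel_size
    kh, kw = ks if isinstance(ks, tuple) else (ks, ks)
    macs = (module.in_channels // module.groups) * kh * kw * module.out_channels * out_h * out_w
    module.__flops__ += int(macs)

# Usage sketch with ptflops (class name shown for illustration):
# from torchvision.ops import DeformConv2d
# macs, params = get_model_complexity_info(
#     model, (3, 224, 224),
#     custom_modules_hooks={DeformConv2d: deform_conv_flops_hook})
```

Any module class can be handled the same way, as long as the hook can derive a count from the module's attributes and the output shape.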
When I use this function to compute FLOPs in the OpenPCDet framework, it didn't work well. Can I use this function this way?
```
macs, params = get_model_complexity_info(model, (1,), as_strings=True,
                                         print_per_layer_stat=False, ...
```
If a module is passed to a sub-module, for example:
```
import torch.nn as nn
import ptflops

class Block(nn.Module):
    def __init__(self, linear_layer) -> None:
        super().__init__()
        self.linear_layer = linear_layer

    def forward(self, ...
```
Wrong FLOPs count if the model is compiled with `torch.compile`:
1. FLOPs of modules in `torch.nn`, for example `nn.Linear` and `nn.Conv2d`, are tripled.
2. FLOPs of custom modules are not counted....
I found that the hook function is not called when profiling a MultiheadAttention module with requires_grad=False, which causes the reported FLOPs to be 0. There are no errors with requires_grad=True.
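A likely cause is that with gradients disabled, `nn.MultiheadAttention` can take its fused fast path, which in some PyTorch versions bypasses the hooks the counter relies on. One commonly reported workaround is to temporarily re-enable `requires_grad` on the parameters just for the measurement and restore it afterwards; a minimal sketch (the context-manager name is ours, not part of ptflops):

```python
import contextlib
import torch
import torch.nn as nn

@contextlib.contextmanager
def grads_enabled_for_profiling(model: nn.Module):
    """Temporarily set requires_grad=True on all parameters so the
    fused fast path is not taken, then restore the original flags."""
    saved = [(p, p.requires_grad) for p in model.parameters()]
    for p, _ in saved:
        p.requires_grad_(True)
    try:
        yield model
    finally:
        for p, flag in saved:
            p.requires_grad_(flag)

# Reproduce the issue's setup: eval mode, frozen parameters.
mha = nn.MultiheadAttention(embed_dim=16, num_heads=2, batch_first=True)
mha.eval()
for p in mha.parameters():
    p.requires_grad_(False)

calls = []
mha.register_forward_hook(lambda module, args, output: calls.append(1))

x = torch.randn(2, 4, 16)
with grads_enabled_for_profiling(mha):
    mha(x, x, x)
print(len(calls))
```

After the `with` block the parameters are frozen again, so the workaround does not disturb the rest of the pipeline.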