flops-counter.pytorch
Does it support torch.bmm() or torch.matmul()?
One part of my model performs matrix multiplication via torch.bmm() / torch.matmul(), but the results show the GMac of this part is 0.0. I want to know whether this is because the GMac is too small, or because torch.bmm() / torch.matmul() are not supported.
ptflops can only take into account ops that are derived from nn.Module. Supporting torch.bmm and similar operations would require a complete redesign of the flops counting mechanism. I've started thinking about it, but am unlikely to do this soon.
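As a workaround until then, one common pattern is to wrap the bare op in a small nn.Module and supply a custom counting hook. The sketch below assumes a ptflops version whose get_model_complexity_info accepts a custom_modules_hooks argument; the BMM wrapper and bmm_flops_hook names are hypothetical, and the hook follows ptflops' internal convention of accumulating MACs into module.__flops__.

```python
import torch
import torch.nn as nn

# Hypothetical wrapper: exposing torch.bmm as an nn.Module makes it
# visible to ptflops' module-based traversal.
class BMM(nn.Module):
    def forward(self, a, b):
        return torch.bmm(a, b)

# Hook in the (module, inputs, output) style ptflops uses internally.
# A batched matmul of (B, N, M) x (B, M, P) costs B * N * M * P MACs.
def bmm_flops_hook(module, inputs, output):
    a, b = inputs
    batch, n, m = a.shape
    p = b.shape[-1]
    module.__flops__ += int(batch * n * m * p)
```

Inside the model, call the wrapper (e.g. self.bmm = BMM(), then self.bmm(a, b)) instead of torch.bmm directly, and register the hook when measuring (again assuming custom_modules_hooks is available in your version):

```python
from ptflops import get_model_complexity_info

macs, params = get_model_complexity_info(
    model, (3, 224, 224),
    as_strings=True,
    custom_modules_hooks={BMM: bmm_flops_hook},
)
```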