KPConv-PyTorch
Could anyone calculate the FLOPs based on ModelNet40 classification?
I tried THOP, but it always reports warnings like these:

[WARN] Cannot find rule for <class 'models.blocks.ResnetBottleneckBlock'>. Treat it as zero Macs and zero Params.
[WARN] Cannot find rule for <class 'torch.nn.modules.linear.Identity'>. Treat it as zero Macs and zero Params.
[WARN] Cannot find rule for <class 'models.blocks.GlobalAverageBlock'>. Treat it as zero Macs and zero Params.
[WARN] Cannot find rule for <class 'torch.nn.modules.container.ModuleList'>. Treat it as zero Macs and zero Params.
[WARN] Cannot find rule for <class 'torch.nn.modules.loss.CrossEntropyLoss'>. Treat it as zero Macs and zero Params.
[WARN] Cannot find rule for <class 'torch.nn.modules.loss.L1Loss'>. Treat it as zero Macs and zero Params.
[WARN] Cannot find rule for <class 'models.architectures.KPCNN'>. Treat it as zero Macs and zero Params.
So when I test with 1024 points, the reported result is 244.51M, which is obviously incorrect.
I don't know THOP; maybe it requires you to define your own rules for custom layers?
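If that is the case, THOP's `profile` accepts a `custom_ops` dictionary that maps a module class to a counting function. Below is a minimal sketch of what registering such rules could look like. It assumes the block's forward receives the point features as its first argument, and the cost formulas, the kernel-point and neighbor constants, and the `net` / `batch` / `config` objects are all placeholders, not values taken from the repository:

```python
import torch
from thop import profile

from models.blocks import ResnetBottleneckBlock, GlobalAverageBlock


def count_resnet_bottleneck(m, x, y):
    # Rough MAC estimate, not the exact cost of the block: two unary (1x1)
    # convolutions around one kernel-point convolution aggregated over the
    # point neighborhoods. Shapes are assumed to be [N, D_in] -> [N, D_out].
    n_points, d_in = x[0].shape
    d_out = y.shape[-1]
    d_mid = d_out // 4          # bottleneck width (assumption)
    k_points = 15               # number of kernel points (assumption)
    avg_neighbors = 30          # average neighborhood size (assumption)
    unary_macs = n_points * (d_in * d_mid + d_mid * d_out)
    kpconv_macs = n_points * avg_neighbors * k_points * d_mid * d_mid
    m.total_ops += torch.DoubleTensor([int(unary_macs + kpconv_macs)])


def count_global_average(m, x, y):
    # Global average pooling only adds, so count one op per feature value.
    m.total_ops += torch.DoubleTensor([int(x[0].numel())])


custom_ops = {ResnetBottleneckBlock: count_resnet_bottleneck,
              GlobalAverageBlock: count_global_average}

# `net`, `batch` and `config` are assumed to be a KPCNN model, one dataloader
# batch and the training config; KPCNN.forward is assumed to take (batch, config).
macs, params = profile(net, inputs=(batch, config), custom_ops=custom_ops)
print('MACs: {:.2f}M, params: {:.2f}M'.format(macs / 1e6, params / 1e6))
```

The remaining warnings (Identity, ModuleList, the loss modules and the top-level KPCNN container) can probably be left as they are, since those modules either do no arithmetic during inference or only wrap blocks that are already counted.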
I am sorry I cannot help more on this matter; I have never tried to calculate the FLOPs. Also, note that the number of points per batch is variable, which means the FLOPs are variable too. The variable batch size strategy aims at reducing these variations, but they still vary a little.
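Because the input size changes from batch to batch, a single measurement is not very representative; one option is to profile several batches and average. A small sketch, reusing the assumed `net`, `config` and `custom_ops` from above and assuming `test_loader` yields batches compatible with `net.forward(batch, config)`:

```python
import copy
import numpy as np
from thop import profile

# Profile a handful of batches and average, since the number of points
# (and therefore the MAC count) differs from batch to batch.
macs_per_batch = []
for i, batch in enumerate(test_loader):
    if i >= 20:
        break
    # Profile a fresh copy so THOP's bookkeeping buffers never accumulate.
    macs, _ = profile(copy.deepcopy(net),
                      inputs=(batch, config),
                      custom_ops=custom_ops,
                      verbose=False)
    macs_per_batch.append(macs)

print('MACs per batch: {:.1f}M +/- {:.1f}M'.format(
    np.mean(macs_per_batch) / 1e6, np.std(macs_per_batch) / 1e6))
```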
Thank you~
Have you finished calculating FLOPs for KPConv?
Thank you~