Shunted-Transformer
About FLOPs
Good work! How do you calculate the FLOPs of the model in your paper? Thanks!
Hi, please refer to the mmcv FLOPs counter: https://github.com/open-mmlab/mmcv/blob/master/mmcv/cnn/utils/flops_counter.py
I remember that the mmcv FLOPs counter does not support the attention calculation, which leads to an underestimated FLOPs count. I suggest using fvcore, as in our UniFormer: https://github.com/Sense-X/UniFormer/blob/ac62a49b40b79e614501905ad2f1c19bd4dcf12e/image_classification/main.py#L258-L259
Yes, we need to calculate the FLOPs of attention manually, but they are not very large for a 224x224 input.
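The manual attention count mentioned above can be done with back-of-the-envelope arithmetic. A sketch in plain Python (the function name and the 384-dim, 6-head configuration are illustrative assumptions, not the paper's exact settings; only the two attention matmuls are counted, projections excluded):

```python
def attention_matmul_flops(seq_len: int, dim: int, heads: int) -> int:
    """Multiply-accumulate count of the two attention matmuls in one layer."""
    head_dim = dim // heads
    qk = heads * seq_len * seq_len * head_dim  # Q @ K^T, per head
    av = heads * seq_len * seq_len * head_dim  # attn @ V, per head
    return qk + av

# A 224x224 input with 16x16 patches gives seq_len = (224 // 16) ** 2 = 196.
print(attention_matmul_flops(196, 384, 6))  # -> 29503488, i.e. ~29.5M MACs
```

At roughly 0.03 GFLOPs per layer, this term is indeed small next to a multi-GFLOP model total, consistent with the comment above.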