
about the inference time

NHZlX opened this issue on Dec 19, 2017 · 0 comments

Hi, I saw the benchmark in the README and have some questions. What platform was used for the inference-time measurements? Is there any NEON acceleration for the depthwise convolutions in MobileNet? There is often a large gap between the theoretical speedup and the speedup actually measured. Although MobileNet's computation is roughly twice that of CondenseNet, I would still like to know the speed difference after platform-specific optimization.
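To make the question concrete, here is a minimal sketch (PyTorch on CPU, with arbitrary layer sizes, not the benchmark behind the table below) of the kind of comparison I mean: the depthwise-separable stack has roughly 8× fewer multiply-adds than the standard 3×3 convolution, but whether that turns into an 8× wall-clock speedup depends on how well the backend (e.g. NEON kernels on ARM) optimizes depthwise/grouped convolution.

```python
import time
import torch
import torch.nn as nn

torch.set_num_threads(1)  # single thread, closer to a mobile-CPU setting

x = torch.randn(1, 128, 56, 56)  # arbitrary layer size, not taken from MobileNet

# Standard 3x3 convolution vs. a depthwise-separable stack (depthwise 3x3 + pointwise 1x1).
standard = nn.Conv2d(128, 128, kernel_size=3, padding=1, bias=False)
separable = nn.Sequential(
    nn.Conv2d(128, 128, kernel_size=3, padding=1, groups=128, bias=False),  # depthwise
    nn.Conv2d(128, 128, kernel_size=1, bias=False),                         # pointwise
)

def bench(layer, x, warmup=5, iters=50):
    """Average wall-clock time per forward pass."""
    with torch.no_grad():
        for _ in range(warmup):
            layer(x)
        start = time.perf_counter()
        for _ in range(iters):
            layer(x)
        return (time.perf_counter() - start) / iters

# The separable stack needs ~8x fewer multiply-adds here, but the measured
# ratio depends entirely on the backend's depthwise/grouped-conv kernels.
print("standard 3x3 conv:        %.4f s" % bench(standard, x))
print("depthwise-separable conv: %.4f s" % bench(separable, x))
```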

Inference time on ARM platform

| Model | FLOPs | Top-1 err. (%) | Time (s) |
|---|---|---|---|
| VGG-16 | 15,300M | 28.5 | 354 |
| ResNet-18 | 1,818M | 30.2 | 8.14 |
| 1.0 MobileNet-224 | 569M | 29.4 | 1.96 |
| CondenseNet-74 (C=G=4) | 529M | 26.2 | 1.89 |
| CondenseNet-74 (C=G=8) | 274M | 29.0 | 0.99 |
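If I read the table correctly, the measured times track the FLOP counts very differently across models: VGG-16 → ResNet-18 is ~8.4× in FLOPs but ~43× in time, ResNet-18 → MobileNet is ~3.2× in FLOPs but ~4.2× in time, and MobileNet → CondenseNet-74 (C=G=8) is ~2.1× in FLOPs and ~2.0× in time. That is why I am asking about the platform and the depthwise-convolution optimization used.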
