
inference speed

Open lianxxx opened this issue 6 years ago • 3 comments

What's the approximate inference speed of MobileNetV2-SSD on CPU? The paper says 200 ms, but I measured around 1 s per image. Is there any way to speed up inference?

lianxxx avatar Apr 24 '19 01:04 lianxxx

The Caffe2 runtime is faster than PyTorch because it fuses batchnorm and conv layers. Using Caffe2 it's less than 200 ms on my PC, and about 1 s on an 800 MHz ARM CPU.
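The fusion mentioned above works because a batchnorm layer in eval mode is just an affine transform, so it can be folded into the preceding convolution's weights and bias ahead of time. A minimal scalar sketch of the algebra (illustration only — a real fusion applies this per output channel to the conv's weight tensor):

```python
import math

# Conv output: y = w*x + b
# Batchnorm (eval mode): z = gamma * (y - mean) / sqrt(var + eps) + beta
# Folding bn into the conv gives a single affine op: z = w_fused*x + b_fused.

def fuse_conv_bn(w, b, gamma, beta, mean, var, eps=1e-5):
    """Fold batchnorm statistics into conv weight/bias (scalar illustration)."""
    scale = gamma / math.sqrt(var + eps)
    return w * scale, (b - mean) * scale + beta

# Check: conv -> bn computed separately matches the fused version.
w, b = 0.8, 0.1
gamma, beta, mean, var = 1.2, -0.3, 0.05, 0.9
x = 2.0

y = w * x + b
z_separate = gamma * (y - mean) / math.sqrt(var + 1e-5) + beta

w_f, b_f = fuse_conv_bn(w, b, gamma, beta, mean, var)
z_fused = w_f * x + b_f

assert abs(z_separate - z_fused) < 1e-9
```

Recent PyTorch versions expose this directly (e.g. `torch.nn.utils.fusion.fuse_conv_bn_eval`), so the same speedup is achievable in PyTorch at inference time without switching runtimes.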

qfgaohao avatar Apr 24 '19 02:04 qfgaohao

Is there any way to increase CPU inference speed? MobileNetV2-SSD in TensorFlow also runs at about 200 ms. What are the speed bottlenecks in the PyTorch version? Thanks.

cwlinghk avatar Oct 25 '19 09:10 cwlinghk

@cwlinghk did you find out the reason?

kunalgoyal9 avatar Nov 25 '19 09:11 kunalgoyal9