mxnet-ssd
How should I evaluate the inference time of network?
I want to know the inference time of this network (i.e., xx FPS). Should I use the code in evaluate.py or demo.py, deploy the model, or measure it some other way?
P.S. I found that when running inference on 1200 pictures, the running time of evaluate.py is very different from demo.py (14s vs 43s, why?). I have already excluded the visualization time in demo.py.
Thanks a lot
I haven't tested it, but it seems evaluate.py uses batching and demo.py doesn't.
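Not from the repo, but here is a minimal, self-contained sketch of why batching matters: it times the same toy symbol (a stand-in for the real SSD network) with batch_size 1 and 32 over the same number of images. The layer choices, shapes, and image count are arbitrary assumptions; only the timing pattern is the point.

```python
import time
import mxnet as mx
import numpy as np

def make_module(batch_size, ctx=mx.cpu()):
    # toy stand-in for the SSD symbol, just to keep the sketch self-contained
    data = mx.sym.Variable('data')
    net = mx.sym.Convolution(data, kernel=(3, 3), num_filter=64, pad=(1, 1))
    net = mx.sym.Activation(net, act_type='relu')
    net = mx.sym.Pooling(net, kernel=(1, 1), global_pool=True, pool_type='avg')
    net = mx.sym.Flatten(net)
    net = mx.sym.FullyConnected(net, num_hidden=21)
    mod = mx.mod.Module(net, data_names=['data'], label_names=None, context=ctx)
    mod.bind(data_shapes=[('data', (batch_size, 3, 300, 300))], for_training=False)
    mod.init_params()
    return mod

def time_images(mod, batch_size, num_images=64):
    # reuse one random batch so disk I/O stays out of the measurement
    batch = mx.io.DataBatch([mx.nd.array(np.random.rand(batch_size, 3, 300, 300))])
    mx.nd.waitall()                            # make sure nothing is pending
    start = time.time()
    for _ in range(num_images // batch_size):
        mod.forward(batch, is_train=False)
        mod.get_outputs()[0].wait_to_read()    # block until this batch is done
    return time.time() - start

for bs in (1, 32):
    print('batch_size=%2d: %.2fs for 64 images' % (bs, time_images(make_module(bs), bs)))
```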
Add this code at evaluate_net.py:92:
results = mod.score(eval_iter, metric, num_batch=None, batch_end_callback=mx.callback.Speedometer(batch_size, frequent=10, auto_reset=False))
The simple answer: now you will get the forward time.
The complicated one: as far as I understand, this time includes the I/O and NMS. In general, the images are fetched in different threads, and the forward pass is computed asynchronously, so it is a bit complicated to really get the forward time (the compute time on the GPU). Sometimes the I/O is the bottleneck, sometimes the GPU. I've seen that changing the batch size has some influence.
Good luck :)
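To add a bit of detail on the asynchronous part: MXNet queues operations and returns immediately, so naive timing around a call mostly measures the enqueue, not the compute. A hedged sketch (plain NDArray ops rather than the SSD model) showing the difference an explicit synchronization point makes:

```python
import time
import mxnet as mx
import numpy as np

x = mx.nd.array(np.random.rand(32, 3, 300, 300))
w = mx.nd.array(np.random.rand(64, 3, 3, 3))

# warm-up so one-time initialization does not pollute the measurements
mx.nd.Convolution(data=x, weight=w, kernel=(3, 3), num_filter=64,
                  no_bias=True).wait_to_read()

start = time.time()
y = mx.nd.Convolution(data=x, weight=w, kernel=(3, 3), num_filter=64, no_bias=True)
print('no sync:   %.4fs' % (time.time() - start))    # roughly the enqueue time

mx.nd.waitall()                                      # drain pending work first
start = time.time()
y = mx.nd.Convolution(data=x, weight=w, kernel=(3, 3), num_filter=64, no_bias=True)
y.wait_to_read()                                     # block until computed
print('with sync: %.4fs' % (time.time() - start))    # actual compute time
```

The same idea applies when timing a Module: call `wait_to_read()` on an output (or `mx.nd.waitall()`) before stopping the clock.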
@zhreshold I'm wondering how to estimate the SPEED like you posted in the README, thanks a lot!
Simply average the per-frame time after running over a large set of images. It is difficult to evaluate the time precisely for each image, since it also depends on the size and number of boxes.
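For reference, a minimal sketch of that averaging approach. The `run_one_frame` function below is a hypothetical placeholder for your real single-image detection call (here it is just a random convolution so the snippet runs on its own); the frame count and warm-up length are arbitrary assumptions.

```python
import time
import mxnet as mx
import numpy as np

weight = mx.nd.array(np.random.rand(64, 3, 3, 3))

def run_one_frame(img):
    # placeholder workload; swap in the real detection call for your model
    out = mx.nd.Convolution(data=img, weight=weight, kernel=(3, 3),
                            num_filter=64, no_bias=True)
    out.wait_to_read()          # synchronize so the timing reflects real work

frames = [mx.nd.array(np.random.rand(1, 3, 300, 300)) for _ in range(20)]

for img in frames[:5]:          # warm-up: exclude one-time initialization costs
    run_one_frame(img)

start = time.time()
for img in frames:
    run_one_frame(img)
elapsed = time.time() - start

print('mean per-frame time: %.4fs, FPS: %.1f'
      % (elapsed / len(frames), len(frames) / elapsed))
```

Warm-up frames matter because the first few calls include one-time costs (memory allocation, cuDNN autotuning on GPU) that would otherwise skew the average.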