FD-MobileNet
Inference time benchmark of different models?
Hi, what is the inference time of FD-MobileNet? Can you provide a speed benchmark for the different models mentioned in the paper? Thanks!
The inference speed reported in the paper is evaluated using a self-maintained version of ncnn, so I cannot put the code here.
If you are interested, you can initialize the models using Caffe and convert them to ncnn params. Since we only care about speed, there is no need to train the models.
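A minimal sketch of that workflow, assuming pycaffe is installed and a deploy prototxt for FD-MobileNet is available (the file names `fd-mobilenet.prototxt` / `fd-mobilenet.caffemodel` are placeholders). The weights are left at their random initialization, since only inference speed is being measured:

```python
import caffe

# Load the network definition in TEST phase; weights stay randomly
# initialized because we never call a solver or load a caffemodel.
caffe.set_mode_cpu()
net = caffe.Net('fd-mobilenet.prototxt', caffe.TEST)

# Save the untrained weights so they can be converted to ncnn format.
net.save('fd-mobilenet.caffemodel')

# Then convert with ncnn's caffe2ncnn tool and time the result with
# benchncnn (paths below are illustrative, adjust to your ncnn build):
#   ./caffe2ncnn fd-mobilenet.prototxt fd-mobilenet.caffemodel \
#                fd-mobilenet.param fd-mobilenet.bin
```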