MobileNetV2 is slow on EdgeTPU
Description
Hi,
I am trying to run MobileNetV2 on the Edge TPU of a Dev Board Mini. I followed the instructions and ran the classification example on the board, but I only get around 20 ms per inference. I am wondering how I could reach the 2.6 ms per inference reported in the benchmark table.
The TFLite runtime version is 14 and the Edge TPU Compiler version is 16 on my Dev Board Mini.
Thanks!
Issue Type
Performance
Operating System
Mendel Linux
Coral Device
Dev Board Mini
Other Devices
No response
Programming Language
Python 3.7
Relevant Log Output
mendel@mocha-horse:~/coral/pycoral$ python3 examples/classify_image.py \
> --model test_data/mobilenet_v2_1.0_224_inat_bird_quant_edgetpu.tflite \
> --labels test_data/inat_bird_labels.txt \
> --input test_data/parrot.jpg
----INFERENCE TIME----
Note: The first inference on Edge TPU is slow because it includes loading the model into Edge TPU memory.
164.7ms
19.3ms
19.0ms
19.0ms
19.1ms
-------RESULTS--------
Ara: 0.75781
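For reference, the timing above comes from classify_image.py, which times each call end to end. Below is a minimal sketch of the same measurement that isolates Interpreter.invoke() by doing the image decode/resize once outside the timed loop, so any CPU-side preprocessing cost is excluded. It assumes pycoral and Pillow are installed and reuses the model/image paths from the command above; the loop count and resampling filter are arbitrary choices, not part of the original example.

import time
from PIL import Image
from pycoral.adapters import classify, common
from pycoral.utils.edgetpu import make_interpreter

MODEL = 'test_data/mobilenet_v2_1.0_224_inat_bird_quant_edgetpu.tflite'
IMAGE = 'test_data/parrot.jpg'

interpreter = make_interpreter(MODEL)
interpreter.allocate_tensors()

# Preprocess once, outside the timed loop, so only invoke() is measured.
size = common.input_size(interpreter)
image = Image.open(IMAGE).convert('RGB').resize(size, Image.LANCZOS)
common.set_input(interpreter, image)

# Warm-up run: the first inference also loads the model into Edge TPU memory.
interpreter.invoke()

for _ in range(5):
    start = time.perf_counter()
    interpreter.invoke()
    print('%.1f ms' % ((time.perf_counter() - start) * 1000))

# Sanity check that the model still produces the expected top class.
print(classify.get_classes(interpreter, top_k=1))

If the numbers from this loop are still around 19 ms rather than the benchmark's 2.6 ms, the gap is not explained by preprocessing overhead alone.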