TFLite inference accuracy problem
Prerequisites
Please answer the following questions for yourself before submitting an issue.
- [x] I am using the latest TensorFlow Model Garden release and TensorFlow 2.
- [x] I am reporting the issue to the correct repository. (Model Garden official or research directory)
- [x] I checked to make sure that this issue has not already been filed.
1. The entire URL of the file you are using
2. Describe the bug
I used the MobileNetV2-SSD model to train a hand detector, and the accuracy was great. But after converting it to TFLite, the accuracy dropped below 0.1.
3. Steps to reproduce
- use export_tflite_graph_tf2.py to generate the saved_model (.pb) file
- convert to tflite
import tensorflow as tf

# Path to the SavedModel directory produced by export_tflite_graph_tf2.py.
converter = tf.lite.TFLiteConverter.from_saved_model("my_mobilenet_model/saved_model")
tflite_model = converter.convert()
with open('model.tflite', 'wb') as f:
    f.write(tflite_model)
- run inference with the model as described in the URL above (see the sketch below)
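For completeness, a minimal sketch of the inference step (the test image name, the input size, and the (pixel - 127.5) / 127.5 normalization are assumptions based on the usual SSD MobileNetV2 float export, not taken from the actual inference_tflite.py):

```python
import numpy as np
import tensorflow as tf
from PIL import Image

# Load the converted model from the step above.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
_, height, width, _ = input_details[0]["shape"]

# Hypothetical test image, resized to the model's expected input size.
image = Image.open("test_hand.jpg").convert("RGB").resize((width, height))
input_data = np.expand_dims(np.asarray(image, dtype=np.float32), axis=0)
# MobileNet-style normalization to [-1, 1]; skip this if the exported graph
# already normalizes internally or if the model was quantized to uint8.
input_data = (input_data - 127.5) / 127.5

interpreter.set_tensor(input_details[0]["index"], input_data)
interpreter.invoke()

# The TFLite detection postprocess emits boxes, classes, scores and
# num_detections; the ordering in output_details can vary between versions.
for detail in output_details:
    print(detail["name"], interpreter.get_tensor(detail["index"]))
```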
4. Expected behavior
I expected the .tflite model's accuracy to be close to the .pb model's.
5. Additional context
(raytf2) pt@pt:~/Desktop/mobilenetv2_lstm_hand_tracking/hand_tracking/exported-models$ python inference_tflite.py
labels = [[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]] scores = [[0.05585384 0.04902783 0.04238626 0.04060924 0.03883892 0.03762564 0.03376037 0.03239408 0.03198031 0.03174883]]
labels = [[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]] scores = [[0.05173582 0.04615015 0.04215723 0.0416927 0.03777227 0.03768802 0.03518435 0.03341618 0.0313746 0.03085464]]
labels = [[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]] scores = [[0.06328875 0.04797256 0.04517871 0.04509196 0.04159552 0.03902051 0.03178978 0.03175989 0.03090507 0.03037795]]
labels = [[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]] scores = [[0.05530125 0.05108589 0.04858315 0.04803067 0.04241174 0.04087931 0.03293553 0.03216282 0.03198099 0.03124663]]
labels = [[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]] scores = [[0.05303013 0.04958305 0.04904506 0.04827413 0.04277271 0.03806984 0.03334618 0.03272745 0.03179631 0.03035322]]
labels = [[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]] scores = [[0.05792895 0.04831615 0.04743657 0.04226294 0.04187098 0.03745279 0.03289929 0.03248784 0.0316222 0.03100973]]
labels = [[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]] scores = [[0.06844547 0.05175066 0.04836395 0.04774806 0.04215744 0.03778329 0.03441754 0.03245634 0.03231528 0.03219658]]
labels = [[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]] scores = [[0.05703849 0.04883415 0.04861888 0.04643914 0.0429863 0.04231891 0.03429419 0.03320628 0.03275487 0.03186417]]
labels = [[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]] scores = [[0.05660558 0.05178571 0.04886103 0.04637614 0.04141521 0.03879005 0.03428188 0.03371784 0.03368044 0.03280622]]
labels = [[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]] scores = [[0.05879703 0.0522404 0.0507392 0.04846442 0.04132211 0.03648785 0.03429508 0.03302094 0.03267851 0.03246176]]
labels = [[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]] scores = [[0.05659071 0.05528083 0.05210385 0.04548982 0.04292649 0.03602132 0.03487003 0.03336293 0.03299844 0.03278598]]
labels = [[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]] scores = [[0.06528211 0.04721582 0.04661804 0.04336995 0.03919598 0.03808099 0.03531608 0.03371251 0.033694 0.03317246]]
labels = [[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]] scores = [[0.0630435 0.04834306 0.04575783 0.04196414 0.03964949 0.03526503 0.03487161 0.03387254 0.03349206 0.03049976]]
labels = [[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]] scores = [[0.05545932 0.05177051 0.04693398 0.04062477 0.03917143 0.03553578 0.03469676 0.03296298 0.03258851 0.03216276]]
labels = [[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]] scores = [[0.06339008 0.05630183 0.04607436 0.04030454 0.03916267 0.03508255 0.03450668 0.03430665 0.03356859 0.03223071]]
labels = [[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]] scores = [[0.06059161 0.04968816 0.04589522 0.04554117 0.03722581 0.03665674 0.03453508 0.03406468 0.03367111 0.03244883]]
labels = [[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]] scores = [[0.06927675 0.05016017 0.0469088 0.04455945 0.0390085 0.03840679 0.03504181 0.03490728 0.03482383 0.03470334]]
labels = [[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]] scores = [[0.05538836 0.05260566 0.04687396 0.0427742 0.04226086 0.03842244 0.0351139 0.03229955 0.03189397 0.03177029]]
labels = [[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]] scores = [[0.05748206 0.04797304 0.04609767 0.04582769 0.04476103 0.0384914 0.03759754 0.03701007 0.03528932 0.03293702]]
labels = [[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]] scores = [[0.05729544 0.04858655 0.04721498 0.0436708 0.03922707 0.03854957
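Uniformly low scores like these often point to an input/preprocessing mismatch rather than a completely broken conversion. As a first check, the interpreter's tensor details can be printed to confirm the expected dtype, shape, and quantization parameters (a minimal sketch, assuming the converted model.tflite from the step above):

```python
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

# Input details show whether the model expects float32 or quantized uint8
# input, and at what resolution.
for detail in interpreter.get_input_details():
    print("input ", detail["name"], detail["shape"], detail["dtype"], detail["quantization"])

# Output details show which index holds boxes, classes, scores, num_detections.
for detail in interpreter.get_output_details():
    print("output", detail["name"], detail["shape"], detail["dtype"])
```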
6. System information
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Ubuntu 20.04
- Mobile device name if the issue happens on a mobile device:
- TensorFlow installed from (source or binary):
- TensorFlow version (use command below): 2.6.2
- Python version: 3.6.13
- Bazel version (if compiling from source):
- GCC/Compiler version (if compiling from source):
- CUDA/cuDNN version:
- GPU model and memory:
Hi @TingRui608470158,
Can you please try with TensorFlow version 2.10.0? I have tried running inference before converting to a TFLite model, using the provided model config and saved weights. Please find the gist here; I took a random example for detecting a hand. If possible, please provide a code snippet in Colab that reproduces the issue reported here. Thanks!
This is inference from the .pb model: pb inference. This is inference from the .tflite model: tflite inference.
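For reference, this is roughly how the two models can be compared on the same preprocessed input (a sketch only; the SavedModel path, signature name, input size, and normalization are assumptions and may differ in the actual export):

```python
import numpy as np
import tensorflow as tf

# One preprocessed input fed to both models (random here just to compare output
# shapes and score ranges; a real test image should be preprocessed identically).
input_data = np.random.uniform(-1.0, 1.0, size=(1, 320, 320, 3)).astype(np.float32)

# SavedModel produced by export_tflite_graph_tf2.py (path from the conversion step).
saved_model = tf.saved_model.load("my_mobilenet_model/saved_model")
detect_fn = saved_model.signatures["serving_default"]
pb_outputs = detect_fn(tf.constant(input_data))
print({name: tensor.shape for name, tensor in pb_outputs.items()})

# TFLite model from the conversion step, fed the identical tensor.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()
interpreter.set_tensor(interpreter.get_input_details()[0]["index"], input_data)
interpreter.invoke()
for detail in interpreter.get_output_details():
    print(detail["name"], interpreter.get_tensor(detail["index"]).shape)
```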
Did you figure out how to fix it, or not yet?