sample-tensorflow-imageclassifier

Custom Trained Model Not Working

Open · meetmustafa opened this issue on Jun 28, 2018 · 4 comments

This application works fine with the mobilenet_quant_v1_224.tflite model. I trained a custom model by following the TensorFlow for Poets Google Codelab and created the graph with this script:

IMAGE_SIZE=224
ARCHITECTURE="mobilenet_0.50_${IMAGE_SIZE}"
python -m scripts.retrain \
  --bottleneck_dir=tf_files/bottlenecks \
  --how_many_training_steps=500 \
  --model_dir=tf_files/models/ \
  --summaries_dir=tf_files/training_summaries/"${ARCHITECTURE}" \
  --output_graph=tf_files/retrained_graph.pb \
  --output_labels=tf_files/retrained_labels.txt \
  --architecture="${ARCHITECTURE}" \
  --image_dir=tf_files/flower_photos

Then, to get a Lite model for this Android Things sample, I followed the tensorflow-for-poets-2-tflite Google Codelab and converted the graph with this toco command:

toco \
  --input_file=tf_files/retrained_graph.pb \
  --output_file=tf_files/optimized_graph.lite \
  --input_format=TENSORFLOW_GRAPHDEF \
  --output_format=TFLITE \
  --input_shape=1,${IMAGE_SIZE},${IMAGE_SIZE},3 \
  --input_array=input \
  --output_array=final_result \
  --inference_type=FLOAT \
  --input_data_type=FLOAT

After capturing an image on the Raspberry Pi 3 Model B, the app crashes with this error:

2018-06-28 12:13:09.115 7685-7735/com.example.androidthings.imageclassifier E/AndroidRuntime: FATAL EXCEPTION: BackgroundThread
Process: com.example.androidthings.imageclassifier, PID: 7685
java.lang.IllegalArgumentException: Failed to get input dimensions. 0-th input should have 602112 bytes, but found 150528 bytes.
    at org.tensorflow.lite.NativeInterpreterWrapper.getInputDims(Native Method)
    at org.tensorflow.lite.NativeInterpreterWrapper.run(NativeInterpreterWrapper.java:98)
    at org.tensorflow.lite.Interpreter.runForMultipleInputsOutputs(Interpreter.java:142)
    at org.tensorflow.lite.Interpreter.run(Interpreter.java:120)
    at com.example.androidthings.tensorflow.classifier.TensorFlowImageClassifier.doRecognize(TensorFlowImageClassifier.java:99)
    at com.example.androidthings.tensorflow.ImageClassifierActivity.onImageAvailable(ImageClassifierActivity.java:244)
    at android.media.ImageReader$ListenerHandler.handleMessage(ImageReader.java:812)
    at android.os.Handler.dispatchMessage(Handler.java:106)
    at android.os.Looper.loop(Looper.java:164)
    at android.os.HandlerThread.run(HandlerThread.java:65)
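For reference, the two byte counts in that exception differ by exactly a factor of four, which is the gap between 4-byte float values and 1-byte quantized values for a 224x224x3 input. A quick check (the class name here is illustrative only):

// Quick check of the byte counts from the exception, assuming a 224x224x3 input.
public class InputSizeCheck {
    public static void main(String[] args) {
        int imageSize = 224;
        int channels = 3;
        System.out.println("float input bytes: " + imageSize * imageSize * channels * 4);     // 602112
        System.out.println("quantized input bytes: " + imageSize * imageSize * channels * 1); // 150528
    }
}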

Please help with this; I am a beginner with TensorFlow.

meetmustafa · Jun 28 '18

Same here. I'm not able to produce a custom .tflite file; I can only get the .lite file, and it does not work.

aashutoshrathi · Jul 07 '18

Yes, I have been struggling with this for a couple of hours, but without result.

iskuhis · Jul 28 '18

@iskuhis I fixed it; you can check my solution at https://github.com/aashutoshrathi/vision

aashutoshrathi · Jul 28 '18

@iskuhis the reason is that the original model shipped with this Android Things sample has quantized inputs, so each input value takes only one byte instead of four. You should either switch back to 4-byte (float) inputs in the app, or adapt how you export your model.
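For the first option (feeding a float model 4-byte values), here is a minimal sketch of what the input buffer could look like, assuming the usual 224x224 RGB input. The class and method names are illustrative, not the sample's actual TensorFlowImageClassifier code, and the 127.5 mean/std normalization should be matched to whatever input_mean/input_std you used during retraining:

import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Minimal sketch of the "switch back to 4 bytes" option: build a float input
// buffer for a non-quantized (FLOAT) .tflite model. Names and the 127.5
// normalization are illustrative; match them to your own retraining settings.
public class FloatInputBuffer {
    static final int IMAGE_SIZE = 224;

    static ByteBuffer fromPixels(int[] argbPixels) {
        // 4 bytes per channel value: 4 * 224 * 224 * 3 = 602112 bytes,
        // which is what the float model in the stack trace expects.
        ByteBuffer buffer = ByteBuffer.allocateDirect(4 * IMAGE_SIZE * IMAGE_SIZE * 3);
        buffer.order(ByteOrder.nativeOrder());
        for (int pixel : argbPixels) {
            buffer.putFloat((((pixel >> 16) & 0xFF) - 127.5f) / 127.5f); // R
            buffer.putFloat((((pixel >> 8) & 0xFF) - 127.5f) / 127.5f);  // G
            buffer.putFloat(((pixel & 0xFF) - 127.5f) / 127.5f);         // B
        }
        return buffer;
    }
}

The quantized sample model, by contrast, takes the raw 0-255 channel bytes directly, which is why the original app only allocates one byte per value.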

lc0 · Oct 20 '18