flutter_tflite
Cannot convert between a TensorFlowLite tensor with type UINT8 and a Java object of type [[F (which is compatible with the TensorFlowLite type FLOAT32).
Possible duplicate of https://github.com/shaqian/flutter_tflite/issues/53
Hello,
These are my Flutter and Dart versions:
==> flutter --version
Flutter 1.22.2 • channel stable • https://github.com/flutter/flutter.git
Framework • revision 84f3d28555 (5 days ago) • 2020-10-15 16:26:19 -0700
Engine • revision b8752bbfff
Tools • Dart 2.10.2
I configured my project as the documentation describes. When I launch the app on my Android device, the model loads successfully, but the app crashes on this line: Tflite.runModelOnImage().
This is the block of code:
Future<void> getArtWork() async {
  final res = await Tflite.loadModel(
    model: "assets/ml_model/model.tflite",
    labels: "assets/ml_model/dict.txt",
  );
  print('state of loading the model: $res');
  var recognitions = await Tflite.runModelOnImage(
    path: imagePath,
    threshold: 0.5,
  );
  print('Recognitions: $recognitions');
  await Tflite.close();
}
This is the error message in the log file:
E/AndroidRuntime(15650): java.lang.RuntimeException: An error occurred while executing doInBackground()
E/AndroidRuntime(15650): at android.os.AsyncTask$4.done(AsyncTask.java:399)
E/AndroidRuntime(15650): at java.util.concurrent.FutureTask.finishCompletion(FutureTask.java:383)
E/AndroidRuntime(15650): at java.util.concurrent.FutureTask.setException(FutureTask.java:252)
E/AndroidRuntime(15650): at java.util.concurrent.FutureTask.run(FutureTask.java:271)
E/AndroidRuntime(15650): at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:289)
E/AndroidRuntime(15650): at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1167)
E/AndroidRuntime(15650): at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:641)
E/AndroidRuntime(15650): at java.lang.Thread.run(Thread.java:919)
E/AndroidRuntime(15650): Caused by: java.lang.IllegalArgumentException: Cannot convert between a TensorFlowLite tensor with type UINT8 and a Java object of type [[F (which is compatible with the TensorFlowLite type FLOAT32).
E/AndroidRuntime(15650): at org.tensorflow.lite.Tensor.throwIfTypeIsIncompatible(Tensor.java:406)
E/AndroidRuntime(15650): at org.tensorflow.lite.Tensor.copyTo(Tensor.java:251)
E/AndroidRuntime(15650): at org.tensorflow.lite.NativeInterpreterWrapper.run(NativeInterpreterWrapper.java:170)
E/AndroidRuntime(15650): at org.tensorflow.lite.Interpreter.runForMultipleInputsOutputs(Interpreter.java:343)
E/AndroidRuntime(15650): at org.tensorflow.lite.Interpreter.run(Interpreter.java:304)
E/AndroidRuntime(15650): at sq.flutter.tflite.TflitePlugin$RunModelOnImage.runTflite(TflitePlugin.java:481)
E/AndroidRuntime(15650): at sq.flutter.tflite.TflitePlugin$TfliteTask.doInBackground(TflitePlugin.java:448)
E/AndroidRuntime(15650): at sq.flutter.tflite.TflitePlugin$TfliteTask.doInBackground(TflitePlugin.java:422)
E/AndroidRuntime(15650): at android.os.AsyncTask$3.call(AsyncTask.java:378)
E/AndroidRuntime(15650): at java.util.concurrent.FutureTask.run(FutureTask.java:266)
E/AndroidRuntime(15650): ... 4 more
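For what it's worth, the exception says the model's input tensor is quantized (UINT8) while the plugin is feeding it a float array (`[[F` is JVM notation for `float[][]`). A quick sanity check of the byte counts in the two stack traces is consistent with a quantized 224×224 RGB input; the shape below is inferred from the logs, not read from your model file, so treat it as an assumption. (To inspect your actual model you could load it in Python with `tf.lite.Interpreter(model_path=...)` and print `get_input_details()[0]["dtype"]` and `["shape"]`.)

```python
# Sketch: why a UINT8 input tensor and a float32 buffer cannot line up.
# The 224x224x3 shape is an assumption inferred from the 150528-byte
# tensor reported in the stack trace.

height, width, channels = 224, 224, 3

uint8_bytes = height * width * channels * 1    # 1 byte per uint8 value
float32_bytes = height * width * channels * 4  # 4 bytes per float32 value

print(uint8_bytes)    # 150528 -- matches "tensor (image) with 150528 bytes"
print(float32_bytes)  # 602112 -- what a float32 model of the same shape would need
```

If the inspection confirms a UINT8 input, the model is quantized, and a plugin path that writes float32 data will always raise this exception.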
I tried to use runModelOnBinary instead and got a similar error in the log file:
I/flutter (27793): Process to get the Artwork started...
I/flutter (27793): the passed image path: /storage/emulated/0/smart_scanner/crop_1754.jpeg
I/flutter (27793): Try to load the model...
I/tflite (27793): Initialized TensorFlow Lite runtime.
I/flutter (27793): state of reading TF model from asset: success
I/flutter (27793): Try to change image to binary. Image path: /storage/emulated/0/smart_scanner/crop_1754.jpeg
I/flutter (27793): Try to compress the file. Original length: 693874
I/flutter (27793): Image converted to binary. Binary size: 25140
E/AndroidRuntime(27793): FATAL EXCEPTION: AsyncTask #1
E/AndroidRuntime(27793): Process: com.atco.techne, PID: 27793
E/AndroidRuntime(27793): java.lang.RuntimeException: An error occurred while executing doInBackground()
E/AndroidRuntime(27793): at android.os.AsyncTask$4.done(AsyncTask.java:399)
E/AndroidRuntime(27793): at java.util.concurrent.FutureTask.finishCompletion(FutureTask.java:383)
E/AndroidRuntime(27793): at java.util.concurrent.FutureTask.setException(FutureTask.java:252)
E/AndroidRuntime(27793): at java.util.concurrent.FutureTask.run(FutureTask.java:271)
E/AndroidRuntime(27793): at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:289)
E/AndroidRuntime(27793): at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1167)
E/AndroidRuntime(27793): at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:641)
E/AndroidRuntime(27793): at java.lang.Thread.run(Thread.java:919)
E/AndroidRuntime(27793): Caused by: java.lang.IllegalArgumentException: Cannot copy to a TensorFlowLite tensor (image) with 150528 bytes from a Java Buffer with 25140 bytes.
E/AndroidRuntime(27793): at org.tensorflow.lite.Tensor.throwIfSrcShapeIsIncompatible(Tensor.java:423)
E/AndroidRuntime(27793): at org.tensorflow.lite.Tensor.setTo(Tensor.java:189)
E/AndroidRuntime(27793): at org.tensorflow.lite.NativeInterpreterWrapper.run(NativeInterpreterWrapper.java:154)
E/AndroidRuntime(27793): at org.tensorflow.lite.Interpreter.runForMultipleInputsOutputs(Interpreter.java:343)
E/AndroidRuntime(27793): at org.tensorflow.lite.Interpreter.run(Interpreter.java:304)
E/AndroidRuntime(27793): at sq.flutter.tflite.TflitePlugin$RunModelOnBinary.runTflite(TflitePlugin.java:507)
E/AndroidRuntime(27793): at sq.flutter.tflite.TflitePlugin$TfliteTask.doInBackground(TflitePlugin.java:448)
E/AndroidRuntime(27793): at sq.flutter.tflite.TflitePlugin$TfliteTask.doInBackground(TflitePlugin.java:422)
E/AndroidRuntime(27793): at android.os.AsyncTask$3.call(AsyncTask.java:378)
E/AndroidRuntime(27793): at java.util.concurrent.FutureTask.run(FutureTask.java:266)
E/AndroidRuntime(27793): ... 4 more
My image was originally around 2,000,000 bytes as a .png and around 1,000,000 bytes as a .jpeg, so I got this error message:
Caused by: java.lang.IllegalArgumentException: Cannot copy to a TensorFlowLite tensor (image) with 150528 bytes from a Java Buffer with 2000000 bytes.
E/AndroidRuntime(27793): at org.tensorflow.lite.Tensor.throwIfSrcShapeIsIncompatible(Tensor.java:423)
So I thought the image had to be smaller than 150528 bytes and used FlutterImageCompress to reduce the size of the image file. However, that didn't fix anything, and I got the log shown at the top of this page.
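A note on the 150528 figure: it is not an upper limit on the image file size but the exact size of the decoded input tensor (224 × 224 × 3 uint8 pixels = 150528 bytes, assuming the shape inferred above). A compressed JPEG/PNG buffer of any length will never match, because the interpreter expects raw pixel bytes of exactly that shape. A minimal Python sketch of the idea, using Pillow (the in-memory image stands in for a real photo; in practice you would `Image.open(...)` the file from the log):

```python
from PIL import Image

# Stand-in for a photo of arbitrary size (a real app would open the
# captured file instead of creating a blank image).
img = Image.new("RGB", (640, 480))

# Decoding + resizing to the assumed model input shape yields exactly
# height * width * channels bytes of raw pixel data -- no compression
# setting on the file can produce this.
resized = img.resize((224, 224))
raw = resized.convert("RGB").tobytes()

print(len(raw))  # 224 * 224 * 3 = 150528 -- matches the tensor size in the log
```

This is why FlutterImageCompress cannot help here: it shrinks the encoded file, whereas the interpreter needs the decoded buffer.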
This is my code:
Future<void> getArtWork(BuildContext context) async {
  print('Process to get the Artwork started...');
  print('the passed image path: $imagePath');
  print('Try to load the model...');
  try {
    final res = await Tflite.loadModel(
      model: "assets/ml_model/model.tflite",
      labels: "assets/ml_model/dict.txt",
    );
    print('state of reading TF model from asset: $res');
    final File imageFile = File(imagePath);
    // File() never returns null, so check existence instead.
    if (!await imageFile.exists()) {
      throw Exception('File not found: $imagePath');
    }
    print('Try to change image to binary. Image path: $imagePath');
    var binary = await imageFile.readAsBytes();
    if (binary.isNotEmpty) {
      if (binary.length > 150528) {
        print('Try to compress the file. Original length: ${binary.length}');
        binary = await FlutterImageCompress.compressWithFile(
          imageFile.absolute.path,
          minWidth: 300,
          minHeight: 300,
          quality: 90,
        );
      }
      print('Image converted to binary. Binary size: ${binary.length}');
      var recognitions = await Tflite.runModelOnBinary(
        binary: binary,
        threshold: 0.5,
      );
      print('Recognitions: $recognitions');
    } else {
      print('Couldn\'t turn image into binary!');
    }
  } catch (e) {
    print(e);
    _navigateToResultScreen(context, null);
  } finally {
    await Tflite.close();
  }
}
Anyway, I think I gave up!
I am having a similar problem with a model I generated using Google Firebase AutoML. I am trying to figure out whether AutoML models are unsupported or there is another problem with the tflite plugin.
@syeds-git I was able to get an AutoML model working using the following libraries:
- https://pub.dev/packages/tflite_flutter
- https://pub.dev/packages/tflite_flutter_helper
I wasn't able to get AutoML working with this library, unfortunately.
@Hesamedin I just got a tflite model working with this plugin. I generated a custom model using
- https://teachablemachine.withgoogle.com/train/image
I feel Google should have done more on the Flutter side to introduce official plugins. It has been a painful journey, and it is also a bit risky to rely on plugins that have no official support :(
Thanks for the recommendation though, I'll try them out.
I get the same error with a tflite model generated on https://colab.research.google.com/.