
Unable to create interpreter.

[Open] Quincy515 opened this issue 2 years ago · 7 comments

tflite_flutter: ^0.10.1

I trained a model using ssd_mobilenet_v1 (or ssd_mobilenet_v2) and converted it to a TFLite model.

```dart
static Future<Interpreter> _loadModel() async {
  dev.log('Loading interpreter options...');
  final interpreterOptions = InterpreterOptions();

  // Use XNNPACK Delegate
  if (Platform.isAndroid) {
    interpreterOptions.addDelegate(XNNPackDelegate());
  }

  // Use Metal Delegate
  if (Platform.isIOS) {
    interpreterOptions.addDelegate(GpuDelegate());
  }

  dev.log('Loading interpreter...');
  return Interpreter.fromAsset(
    _modelPath,
    options: interpreterOptions..threads = 4,
  );
}
```

Running it produces the following error:

```
I/tflite (17536): Initialized TensorFlow Lite runtime.
E/flutter (17536): [ERROR:flutter/runtime/dart_vm_initializer.cc(41)] Unhandled Exception: Invalid argument(s): Unable to create interpreter.
E/flutter (17536): #0  checkArgument (package:quiver/check.dart:45:5)
E/flutter (17536): #1  new Interpreter._create (package:tflite_flutter/src/interpreter.dart:58:5)
E/flutter (17536): #2  new Interpreter.fromBuffer (package:tflite_flutter/src/interpreter.dart:109:37)
E/flutter (17536): #3  Interpreter.fromAsset (package:tflite_flutter/src/interpreter.dart:126:24)
E/flutter (17536): #4  Detector.start (package:object_detection_tflite_flutter/service/detector_service.dart:102:7)
E/flutter (17536):
I/Camera (17536): startPreviewWithImageStream
```
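"Unable to create interpreter" is thrown when the native side cannot parse the model buffer at all. As a first sanity check (a minimal sketch in plain Dart, with a hypothetical helper name), you can verify that the bundled asset is actually a TFLite FlatBuffer, which carries the file identifier "TFL3" at byte offset 4:

```dart
import 'dart:io';

/// Returns true if the file at [path] looks like a TFLite model:
/// FlatBuffer files store a 4-byte file identifier at offset 4,
/// and for TFLite models it is "TFL3".
bool looksLikeTflite(String path) {
  final bytes = File(path).readAsBytesSync();
  return bytes.length >= 8 &&
      String.fromCharCodes(bytes.sublist(4, 8)) == 'TFL3';
}
```

If this check fails for the asset in your app bundle, the problem is the file itself (e.g. a corrupted copy or a non-FlatBuffer export), not the interpreter options.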

Quincy515 avatar Jul 30 '23 09:07 Quincy515

Does your example work with the model that's included via the install script in one of our examples, or is it model specific?

PaulTR avatar Jul 31 '23 19:07 PaulTR

> Does your example work with the model that's included via the install script in one of our examples, or is it model specific?

The tflite model from the example works, but the model I trained myself does not, and I don't know how to solve it.

Quincy515 avatar Aug 02 '23 03:08 Quincy515

How are you training the model that you're using? Is it with TensorFlow Lite Model Maker, or are you creating a TensorFlow model that you convert to lite, or are you using some other tool? I'm wondering if the model process is doing something funny with it.

PaulTR avatar Aug 21 '23 15:08 PaulTR

Hi @PaulTR, thank you for your brilliant library.

I trained the model with TensorFlow, and it works well with the testing script; please see the video below: https://github.com/tensorflow/flutter-tflite/assets/8953129/4e174215-aa59-49c4-b8a5-b6cdd2b1ad81

After that, I converted the exported graph file into a TFLite model file with the script below: [screenshot of the conversion script]

Then I used the TFLite model with the live_object_detection_ssd_mobilenet sample, and I got this issue: [screenshot of the error]

Do you have any solution to fix it? Thank you so much.

vankhoa01 avatar Sep 14 '23 05:09 vankhoa01

I was getting the same error, so I tried removing the XNNPACK delegate on Android, and the code works after that:

```dart
// Removed this piece of code
if (Platform.isAndroid) {
  options.addDelegate(XNNPackDelegate());
}
```

After removing this code, my model loads perfectly.
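Building on that workaround, one option is to keep the delegates as an optimization but fall back to plain CPU when interpreter creation fails. This is a sketch against the tflite_flutter API used earlier in the thread; the retry structure and function name are my own suggestion, not from the library docs:

```dart
import 'dart:io';

import 'package:tflite_flutter/tflite_flutter.dart';

Future<Interpreter> loadModelWithFallback(String modelPath) async {
  // Try hardware delegates first, as in the original _loadModel().
  final accelerated = InterpreterOptions()..threads = 4;
  if (Platform.isAndroid) accelerated.addDelegate(XNNPackDelegate());
  if (Platform.isIOS) accelerated.addDelegate(GpuDelegate());
  try {
    return await Interpreter.fromAsset(modelPath, options: accelerated);
  } on ArgumentError {
    // Some custom SSD models fail under XNNPACK with
    // "Unable to create interpreter"; retry on plain CPU.
    return Interpreter.fromAsset(
      modelPath,
      options: InterpreterOptions()..threads = 4,
    );
  }
}
```

The catch targets `ArgumentError` because the stack trace above shows the failure surfacing via quiver's `checkArgument` ("Invalid argument(s): Unable to create interpreter.").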

BilalKashif avatar Mar 13 '24 01:03 BilalKashif

I'm facing exactly the same issue. My model is trained using this tutorial: https://colab.research.google.com/github/EdjeElectronics/TensorFlow-Lite-Object-Detection-on-Android-and-Raspberry-Pi/blob/master/Train_TFLite2_Object_Detction_Model.ipynb

wildsurfer avatar May 17 '24 12:05 wildsurfer

Did anyone find a fix?

andresgd7 avatar Jul 25 '24 05:07 andresgd7