
ArgumentError (Invalid argument(s): Unable to create interpreter.) with no indication about the error type


Several of my models work fine with flutter-tflite, which is great. Unfortunately, a few others load successfully with ML Kit but cannot be loaded with flutter-tflite. I have to use TF Lite rather than ML Kit because these models have no final softmax layer (ML Kit loads them, but I cannot do anything meaningful with them afterwards).

For now I am testing only on Android, with flutter-tflite 0.10.1 on a real device (Google Pixel 6a). My code is the following:

import 'package:tflite_flutter/tflite_flutter.dart';

final options = InterpreterOptions();
// Use the XNNPACK delegate
options.addDelegate(XNNPackDelegate());
// Throws ArgumentError: Unable to create interpreter.
interpreter = await Interpreter.fromAsset(modelPath, options: options);

The exception is

ArgumentError (Invalid argument(s): Unable to create interpreter.)

The model is fairly large, about 350 MB. The only information in the log is:

I/tflite  (28784): Created TensorFlow Lite XNNPACK delegate for CPU.
I/tflite  (28784): Initialized TensorFlow Lite runtime.

I have had models that failed to load in the past (like this one), but in those cases an error was reported:

  • either in the exception message, when using ML Kit,
  • or in the console, when using flutter-tflite.

How can I investigate what prevents flutter-tflite from loading my model?
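
For reference, this is the kind of probe I can run (a minimal sketch, assuming only the tflite_flutter API used above; probeModel is just an illustrative name). Catching the ArgumentError only yields the generic message, and the native side (e.g. adb logcat filtered on the "tflite" tag) shows nothing beyond the two lines quoted above:

import 'package:tflite_flutter/tflite_flutter.dart';

// Sketch only: try to create the interpreter and report what little the
// Dart side exposes.
Future<void> probeModel(String modelPath) async {
  try {
    final options = InterpreterOptions()..addDelegate(XNNPackDelegate());
    final interpreter = await Interpreter.fromAsset(modelPath, options: options);
    print('Loaded OK with ${interpreter.getInputTensors().length} input tensor(s)');
    interpreter.close();
  } on ArgumentError catch (e) {
    // Only "Invalid argument(s): Unable to create interpreter." is available here.
    print('Interpreter creation failed: $e');
  }
}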

EDIT: the problem can be solved by commenting out the following line:

options.addDelegate(XNNPackDelegate());

Using GpuDelegateV2 causes the same error as XNNPackDelegate.
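
For now I am working around it by falling back to the default CPU interpreter when delegate-based creation fails (sketch only, same API assumptions as above; loadWithFallback is just an illustrative name):

// Workaround sketch: try the XNNPACK delegate first, then fall back to the
// plain CPU interpreter if creation fails.
Future<Interpreter> loadWithFallback(String modelPath) async {
  try {
    final options = InterpreterOptions()..addDelegate(XNNPackDelegate());
    return await Interpreter.fromAsset(modelPath, options: options);
  } on ArgumentError {
    // Both XNNPackDelegate and GpuDelegateV2 fail for this model.
    return await Interpreter.fromAsset(modelPath);
  }
}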

Any idea why this particular model causes trouble? Maybe an out-of-memory issue?

andynewman10 avatar Sep 27 '23 16:09 andynewman10