
Issue with Unsupported Ops in TFLite Flutter Plugin

Open Elienvalleau opened this issue 1 year ago • 7 comments

Hello,

I'm encountering an issue when trying to run inference with a TensorFlow Lite model in Flutter. The model utilizes some TensorFlow ops that are not supported by the standard TensorFlow Lite interpreter. The specific error message I'm receiving is:

```
Select TensorFlow op(s), included in the given model, is(are) not supported by this interpreter. Make sure you apply/link the Flex delegate before inference.
```

I've ensured that the `tensorflow-lite-select-tf-ops` dependency is included in my Android build, but the error persists:

```gradle
dependencies {
    implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:2.12.0'
}
```

And here is how I converted my model:

```python
converter = tf.lite.TFLiteConverter.from_saved_model('saved_model')
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # enable TensorFlow Lite ops.
    tf.lite.OpsSet.SELECT_TF_OPS,    # enable TensorFlow ops.
]
tflite_model = converter.convert()
```
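To pin down exactly which ops the converted model will demand from the Flex delegate, the `.tflite` file can be scanned for `Flex`-prefixed custom op names. This is a heuristic sketch, not an official API: it relies on the observation that SELECT_TF_OPS kernels are stored in the flatbuffer as readable names like `FlexConv2D` (the `model.tflite` path is an assumption).

```python
import re

def find_flex_ops(model_bytes: bytes) -> list[str]:
    """Return the distinct Flex op names embedded in a .tflite blob.

    Heuristic: Flex (SELECT_TF_OPS) kernels appear in the flatbuffer as
    custom op names prefixed with "Flex", so a raw byte scan finds them.
    """
    return sorted({m.decode() for m in re.findall(rb"Flex\w+", model_bytes)})

# Demo on a synthetic blob; with a real model you would use:
#   find_flex_ops(open("model.tflite", "rb").read())
blob = b"\x00\x01FlexConv2D\x00FlexErf\x00FlexConv2D\x00"
print(find_flex_ops(blob))  # ['FlexConv2D', 'FlexErf']
```

Every op this reports must be served by the Flex delegate at runtime, which is what the `tensorflow-lite-select-tf-ops` AAR is supposed to provide on Android.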

Is there any solution? Thanks

Elienvalleau avatar Nov 05 '23 15:11 Elienvalleau

Hey. Have you tried that specific model on Android or iOS directly (without the Flutter layer on top) by chance? I have a feeling there's something specific about the model that just isn't convertible (TFLite doesn't support all of the ops available in regular TensorFlow), so it might not be a go for this situation.

PaulTR avatar Nov 05 '23 15:11 PaulTR

Hey @PaulTR, I haven't had the opportunity to try the TFLite model directly on Android or iOS, but it works very well when run directly in Python. To provide more context, it's an OCR model that was originally in PaddlePaddle, which I converted to ONNX, then to TensorFlow, and finally to TensorFlow Lite. The error also mentions `Node number 407 (FlexConv2D) failed to prepare.` Thx
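For reference, the "works in Python" path can be reproduced end to end without the real OCR model. This is a minimal sketch using a toy module and the same `SELECT_TF_OPS` flags as in the issue; the `Toy` class and its shapes are purely illustrative stand-ins:

```python
import numpy as np
import tensorflow as tf

class Toy(tf.Module):
    """Illustrative stand-in for the real OCR model."""
    @tf.function(input_signature=[tf.TensorSpec([1, 8], tf.float32)])
    def __call__(self, x):
        return tf.matmul(x, tf.ones([8, 4]))

toy = Toy()
converter = tf.lite.TFLiteConverter.from_concrete_functions(
    [toy.__call__.get_concrete_function()], toy
)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,  # same flag as in the issue
]
tflite_model = converter.convert()

# The Python interpreter bundled with full TensorFlow links the Flex
# delegate automatically, which is why a model can succeed here while the
# on-device interpreter rejects it unless the select-tf-ops AAR is linked.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.ones(inp["shape"], dtype=np.float32))
interpreter.invoke()
result = interpreter.get_tensor(out["index"])
print(result.shape)  # (1, 4); each element is 8.0 (dot of eight ones)
```

The key point is that Python-side success only proves the conversion was valid, not that the mobile runtime has the Flex kernels available.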

Elienvalleau avatar Nov 06 '23 07:11 Elienvalleau

I had the same issue when I converted a PyTorch model to ONNX and then to TensorFlow. The model used was BERT base uncased.

saurabhkumar8112 avatar Nov 20 '23 12:11 saurabhkumar8112

> I had the same issue when I converted a PyTorch model to ONNX and then to TensorFlow. The model used was BERT base uncased.

I honestly would be very surprised if it worked after you've converted between three separate frameworks and then into a TFLite version (which is a subset of TensorFlow) :) My guess is for yours you'd want to try to reduce some of those steps when creating the model.

PaulTR avatar Nov 20 '23 16:11 PaulTR

Yeah, but at present there's no way to run a Torch model (especially big ones like BERT) on edge devices using Flutter. So the route is PyTorch -> ONNX -> TensorFlow -> TFLite.

saurabhkumar8112 avatar Nov 21 '23 09:11 saurabhkumar8112

> Yeah, but at present there's no way to run a Torch model (especially big ones like BERT) on edge devices using Flutter. So the route is PyTorch -> ONNX -> TensorFlow -> TFLite.

Were you able to run BERT using TFLite or any other ML Library in Flutter?

CristiSch avatar Feb 22 '24 18:02 CristiSch

I had the same issue. Could someone help pls?

vominhmanh avatar Apr 04 '24 09:04 vominhmanh