
Regular TensorFlow ops are not supported by this interpreter. Make sure you apply/link the Flex delegate before inference

Open anovis opened this issue 3 years ago • 15 comments

Getting the error "Regular TensorFlow ops are not supported by this interpreter", same as #63.

I/tflite  ( 7126): Initialized TensorFlow Lite runtime.
E/tflite  ( 7126): Regular TensorFlow ops are not supported by this interpreter. Make sure you apply/link the Flex delegate before inference.
E/tflite  ( 7126): Node number 0 (FlexSentencepieceOp) failed to prepare.
E/flutter ( 7126): [ERROR:flutter/lib/ui/ui_dart_state.cc(166)] Unhandled Exception: Bad state: failed precondition

I am also using a converted TensorFlow 2.0 text model, which required SELECT_TF_OPS during conversion:

converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.target_spec.supported_ops = [
  tf.lite.OpsSet.TFLITE_BUILTINS, # enable TensorFlow Lite ops.
  tf.lite.OpsSet.SELECT_TF_OPS # enable TensorFlow ops.
]
tflite_model = converter.convert()

I came across https://www.tensorflow.org/lite/guide/ops_select#android_aar, which suggests adding a dependency to build.gradle, which I tried, but got the same error.

dependencies {
    implementation 'org.tensorflow:tensorflow-lite:0.0.0-nightly-SNAPSHOT'
    // This dependency adds the necessary TF op support.
    implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:0.0.0-nightly-SNAPSHOT'
}

Wondering if anyone else ran into this and suggestions for how I could proceed.

anovis avatar Apr 16 '21 00:04 anovis

@anovis Can you try downloading the flex binaries from here.

  1. Replace libtensorflowlite_c.so in android/app/src/main/jniLibs/arm64-v8a with libtensorflowlite_c_arm64_flex.so.
  2. Rename libtensorflowlite_c_arm64_flex.so to libtensorflowlite_c.so.
  3. Repeat the same steps with libtensorflowlite_c_arm_flex.so in android/app/src/main/jniLibs/armeabi-v7a.
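The three steps above can be sketched as a small script. This is only an illustration: it assumes the downloaded flex .so files sit in the Flutter project root, and that the jniLibs folders follow the standard layout named in the steps.

```python
# Sketch of the binary-swap steps above. Assumptions: the flex libraries
# (libtensorflowlite_c_arm64_flex.so, libtensorflowlite_c_arm_flex.so) were
# downloaded into the current directory, and this runs from the project root.
import os
import shutil

JNILIBS = "android/app/src/main/jniLibs"

# Map each downloaded flex library to the ABI folder it replaces.
SWAPS = {
    "libtensorflowlite_c_arm64_flex.so": "arm64-v8a",
    "libtensorflowlite_c_arm_flex.so": "armeabi-v7a",
}

for flex_name, abi in SWAPS.items():
    dest_dir = os.path.join(JNILIBS, abi)
    os.makedirs(dest_dir, exist_ok=True)  # no-op if the folder already exists
    if os.path.exists(flex_name):
        # Copy the flex build over the stock library, keeping the original name,
        # so the plugin loads the flex-enabled binary transparently.
        shutil.copy(flex_name, os.path.join(dest_dir, "libtensorflowlite_c.so"))
```

Keeping the file name libtensorflowlite_c.so is the key detail: the plugin loads the library by that fixed name, so the flex build must replace it rather than sit beside it.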

am15h avatar Apr 17 '21 07:04 am15h

@am15h Thanks for the help. I replaced those binaries and got the same error. For more context, I am running on an Android emulator on Ubuntu, Flutter channel 1.20.4, and tflite_flutter: ^0.5.0.

anovis avatar Apr 17 '21 14:04 anovis

Please try running on a real device. Flex binaries are not available for emulators yet.

am15h avatar Apr 17 '21 14:04 am15h

Thanks. I ran on my Android device and received:

I/tflite  (31894): Initialized TensorFlow Lite runtime.
I/tflite  (31894): Created TensorFlow Lite delegate for select TF ops.
I/tflite  (31894): TfLiteFlexDelegate delegate: 7 nodes delegated out of 284 nodes with 3 partitions.
E/tflite  (31894): Op type not registered 'SentencepieceOp' in binary running on localhost. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.
E/tflite  (31894): Delegate kernel was not initialized
E/tflite  (31894): Node number 284 (TfLiteFlexDelegate) failed to prepare.

anovis avatar Apr 17 '21 14:04 anovis

Looking at that error, it seems the tensorflow_text binaries are missing: https://github.com/tensorflow/hub/issues/463

anovis avatar Apr 17 '21 14:04 anovis

Can you try following these steps while converting to tflite? https://www.tensorflow.org/lite/guide/op_select_allowlist#tensorflow_text_and_sentencepiece_operators
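For reference, a minimal sketch of that allowlist guidance. This is an assumption-laden illustration, not the model's actual conversion script: it assumes the tensorflow and tensorflow_text packages are installed, and the helper name convert_with_text_ops is made up for this example. The key point from the guide is that importing tensorflow_text registers the TF Text / SentencePiece kernels before conversion.

```python
# Sketch based on the op_select_allowlist guide. Assumption: `tensorflow`
# and `tensorflow_text` are installed; `convert_with_text_ops` is a
# hypothetical helper name, not part of any library.
try:
    import tensorflow as tf
    import tensorflow_text  # noqa: F401 -- importing registers TF Text/SentencePiece ops
    HAVE_TF = True
except ImportError:
    HAVE_TF = False  # lets the sketch load even without TensorFlow installed


def convert_with_text_ops(saved_model_dir):
    """Convert a saved model that uses TF Text / SentencePiece ops."""
    if not HAVE_TF:
        raise RuntimeError("tensorflow and tensorflow_text are required")
    converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
    converter.target_spec.supported_ops = [
        tf.lite.OpsSet.TFLITE_BUILTINS,  # TensorFlow Lite builtin ops
        tf.lite.OpsSet.SELECT_TF_OPS,    # fall back to TensorFlow (flex) ops
    ]
    converter.allow_custom_ops = True  # needed for TF Text custom ops
    return converter.convert()
```

The difference from the snippet earlier in the thread is the tensorflow_text import and allow_custom_ops flag; without the import, SentencepieceOp is never registered and the converted model fails exactly as in the log above.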

am15h avatar May 18 '21 18:05 am15h

I had originally built the model using

converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.target_spec.supported_ops = [
  tf.lite.OpsSet.TFLITE_BUILTINS, # enable TensorFlow Lite ops.
  tf.lite.OpsSet.SELECT_TF_OPS # enable TensorFlow ops.
]
tflite_model = converter.convert()

The only difference I can see is the import sentencepiece as spm line, so I will go back and add that.

anovis avatar May 21 '21 16:05 anovis

@anovis, were you able to get it working?

am15h avatar Jun 28 '21 11:06 am15h

@anovis I have made a fork and modified it to add an interpreter option for the flex delegate. It's a slightly hacky solution for the Android platform, based on the existing flex JNI libs.

cpoohee avatar Jul 07 '21 05:07 cpoohee

@cpoohee thanks! I will try that later this week and report back!

anovis avatar Jul 08 '21 02:07 anovis

@anovis Have you solved your problem? I found this: https://www.tensorflow.org/lite/inference_with_metadata/lite_support#getting_started. I tried using implementation 'org.tensorflow:tensorflow-lite-support:0.0.0-nightly-SNAPSHOT', which includes https://github.com/tensorflow/tflite-support/tree/master/tensorflow_lite_support/custom_ops/kernel/sentencepiece, but nothing happened.

Extremesarova avatar Jul 26 '21 14:07 Extremesarova

@anovis Hi! Any updates on your issue? :)

Extremesarova avatar Oct 22 '21 06:10 Extremesarova

Please try running on a real device. Flex binaries are not available for emulators yet.

@am15h do we have flex binaries for x86? I am able to make it work on phones (ARM) after replacing the .so files, but it is still not working on the emulator (x86).
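Since the flex binaries in this thread are ARM-only, it can help to confirm which ABI the attached device or emulator actually reports. A small sketch (assumption: adb is on PATH; device_abi is a made-up helper name):

```python
# Query the ABI of the attached device/emulator via adb. Assumption: adb is
# installed and on PATH; `device_abi` is a hypothetical helper for this thread.
import shutil
import subprocess


def device_abi():
    """Return e.g. 'arm64-v8a' or 'x86_64', or None if adb/device is unavailable."""
    if shutil.which("adb") is None:
        return None  # adb not installed; nothing to query
    try:
        out = subprocess.run(
            ["adb", "shell", "getprop", "ro.product.cpu.abi"],
            capture_output=True, text=True, timeout=15,
        )
    except subprocess.TimeoutExpired:
        return None
    return out.stdout.strip() or None
```

If this reports x86 or x86_64, the ARM flex binaries from earlier in the thread will not load, which matches the emulator failures described above.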

clive107 avatar Feb 25 '22 08:02 clive107

Hello @am15h! Is it possible to have flex binaries on the emulator now? If not, how can I work around this problem?

AntoineChauviere avatar Sep 26 '22 09:09 AntoineChauviere

Hello,

I tried to implement the autocomplete example from Google (https://github.com/tensorflow/examples/tree/master/lite/examples/generative_ai/android) in Flutter, but the tf_ops error pops up for me too during the debug session. There is no error in the native Android app from the Google examples.

Here is the brief conversion code:

@tf.function
def generate(prompt, max_length):
  return gpt2_lm.generate(prompt, max_length)

concrete_func = generate.get_concrete_function(tf.TensorSpec([], tf.string), 100)

gpt2_lm.jit_compile = False
converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func], gpt2_lm)
converter.target_spec.supported_ops = [
  tf.lite.OpsSet.TFLITE_BUILTINS, # enable TensorFlow Lite ops.
  tf.lite.OpsSet.SELECT_TF_OPS # enable TensorFlow ops.
]
converter.allow_custom_ops = True
converter.target_spec.experimental_select_user_tf_ops = ["UnsortedSegmentJoin", "UpperBound"]
converter._experimental_guarantee_all_funcs_one_use = True
generate_tflite = converter.convert()

I think there is no support for these experimental TF ops in the tflite_flutter plugin right now. Could you please help?

galaturka avatar Jun 15 '23 21:06 galaturka