tflite-android-transformers

model generated by model_generation but not able to invoke.

Open o20021106 opened this issue 5 years ago • 4 comments

I tried to convert a TensorFlow model to TFLite using model_generation/distilbert.py.

I was able to convert and save the model without error, but `allocate_tensors()` failed with the Python API, and `invoke()` on the interpreter also failed with a RuntimeError:

`RuntimeError: Regular TensorFlow ops are not supported by this interpreter. Make sure you apply/link the Flex delegate before inference. Node number 0 (FlexShape) failed to prepare.`

What should I do to fix this error?

Here's my Colab to reproduce the error (using tf-nightly-gpu==2.2.0.dev20200115).

o20021106 avatar Jan 17 '20 04:01 o20021106

@o20021106 Hello, I met the same problem. Have you solved it? Thanks!

fuzhenxin avatar Jun 22 '20 11:06 fuzhenxin

My problem was solved by changing supported_ops in Python to `converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS, tf.lite.OpsSet.SELECT_TF_OPS]`.

fuzhenxin avatar Jun 23 '20 01:06 fuzhenxin
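In context, the fix above goes into the TFLite conversion script before calling `convert()`. A minimal, self-contained sketch (using a tiny stand-in Keras model rather than the thread's DistilBERT from model_generation/distilbert.py, which is too large to inline here):

```python
import tensorflow as tf

# Tiny stand-in model; the thread's actual model is DistilBERT
# built by model_generation/distilbert.py.
model = tf.keras.Sequential([tf.keras.layers.Dense(2, input_shape=(4,))])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
# Let ops without a TFLite builtin kernel fall back to TensorFlow
# (Flex / select TF ops), which avoids the FlexShape prepare failure.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # native TFLite kernels
    tf.lite.OpsSet.SELECT_TF_OPS,    # TensorFlow ops via the Flex delegate
]
tflite_model = converter.convert()

# The resulting flatbuffer can be written to disk for the Android app.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

Note that a model converted with SELECT_TF_OPS still needs the Flex delegate linked at inference time; the plain TFLite runtime alone cannot run it.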

> My problem was solved by changing supported_ops in python to converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS, tf.lite.OpsSet.SELECT_TF_OPS]

Yup, helped. Thanks!

bartekcensorpic avatar Aug 27 '20 13:08 bartekcensorpic

> My problem was solved by changing supported_ops in python to converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS, tf.lite.OpsSet.SELECT_TF_OPS]

Can I continue to use this package: tensorflow-lite-with-select-tf-ops-0.0.0-nightly.aar?

songdlut avatar Jun 29 '22 09:06 songdlut
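For the Android side, TensorFlow now publishes the select-TF-ops runtime as a Maven artifact, so a hand-built or nightly AAR like the one mentioned above should no longer be needed. A hedged Gradle sketch (version numbers are illustrative; check the current TFLite release notes):

```groovy
dependencies {
    // Core TFLite runtime.
    implementation 'org.tensorflow:tensorflow-lite:2.9.0'
    // Flex delegate / select TF ops; including this dependency is what
    // "applies/links the Flex delegate" that the RuntimeError asks for.
    implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:2.9.0'
}
```

This artifact is considerably larger than the core runtime, since it bundles TensorFlow kernels for the non-builtin ops.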