
No error and no output model

Open rfilkov opened this issue 2 years ago • 4 comments

Describe the bug

I tried to convert one specific TFLite model to ONNX. Everything went fine, there was no error, but the converter stopped before the generation of the output model. I'm not sure what exactly went wrong. Other TFLite models converted successfully. Please look at the command and the output below.

C:\models\tensorflow>python -m tf2onnx.convert --tflite detector.tflite --output detector.onnx
C:\Python\Lib\runpy.py:126: RuntimeWarning: 'tf2onnx.convert' found in sys.modules after import of package 'tf2onnx', but prior to execution of 'tf2onnx.convert'; this may result in unpredictable behaviour
  warn(RuntimeWarning(msg))
2023-07-14 14:12:52,354 - INFO - Using tensorflow=2.8.0, onnx=1.14.0, tf2onnx=1.14.0/25c977
2023-07-14 14:12:52,354 - INFO - Using opset <onnx, 15>
INFO: Created TensorFlow Lite XNNPACK delegate for CPU.

C:\models\tensorflow>

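Since the converter exits with no message and no output model, a first diagnostic is to check whether the process is dying with a non-zero exit code, which distinguishes a silent native crash from a quiet-but-successful run. A minimal sketch (the helper and its name are illustrative, not part of tf2onnx):

```python
import subprocess
import sys

def run_and_report(cmd):
    """Run a command in a child process and report how it exited.

    A silent native crash (e.g. an access violation inside a TF op)
    still produces a non-zero exit code even when nothing is printed
    to the console.
    """
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        # On Windows a native crash often shows up as a large code
        # such as 0xC0000005 (access violation).
        print(f"exited abnormally with code {result.returncode:#x}")
    return result.returncode, result.stderr
```

For the report above that would be `run_and_report([sys.executable, "-m", "tf2onnx.convert", "--tflite", "detector.tflite", "--output", "detector.onnx"])`; the same check is available directly in cmd.exe with `echo %ERRORLEVEL%` right after the conversion.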
System information

  • OS Platform and Distribution (e.g., Linux Ubuntu 18.04*): Windows 11 Pro 22H2
  • TensorFlow Version: 2.8.0
  • Python version: 3.10.4
  • ONNX version (if applicable, e.g. 1.11*): 1.14.0
  • ONNXRuntime version (if applicable, e.g. 1.11*): 1.15.1

To Reproduce

Here is a link to the TFLite model I was trying to convert.

rfilkov avatar Jul 14 '23 11:07 rfilkov

The error happened while TensorFlow was loading the TFLite model. I will check whether an issue needs to be filed in the TensorFlow repo.

fatcat-z avatar Jul 26 '23 17:07 fatcat-z
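One way to follow up on that diagnosis is to try the TFLite load step on its own: if plain `tf.lite.Interpreter` also fails on this file, the bug is in TensorFlow's loader rather than in tf2onnx. A minimal sketch (the helper name is illustrative; `detector.tflite` is the file from the original report, and TensorFlow is imported lazily so the helper itself needs no TF install to define):

```python
def check_tflite_loads(tflite_path):
    """Load a TFLite flatbuffer with TensorFlow's own interpreter.

    If this crashes or raises on the model from the report, the
    failure is in TensorFlow's TFLite loader; if it succeeds, the
    problem is more likely inside tf2onnx's conversion pass.
    """
    import tensorflow as tf  # lazy import; requires a TensorFlow install

    interpreter = tf.lite.Interpreter(model_path=tflite_path)
    interpreter.allocate_tensors()
    return interpreter.get_input_details(), interpreter.get_output_details()
```

Usage would be `check_tflite_loads("detector.tflite")`, run in a fresh process so a native crash is unambiguous.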

Thank you! There was no message displayed on screen.

rfilkov avatar Jul 28 '23 13:07 rfilkov

I'm having the same issue.

RuntimeWarning: 'tf2onnx.convert' found in sys.modules after import of package 'tf2onnx', but prior to execution of 'tf2onnx.convert'; this may result in unpredictable behaviour

The conversion stops after this line: `INFO: Created TensorFlow Lite XNNPACK delegate for CPU.`

varunatohilo avatar Jun 05 '24 10:06 varunatohilo

I'm having the identical issue. I am able to load and use the model with `interpreter = tf.lite.Interpreter('model.tflite')`. I'm happy to try to debug the issue, but I'm not sure where to start.

LucosidE avatar Aug 06 '24 17:08 LucosidE
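A further avenue for debugging is to call tf2onnx's Python API directly instead of the `python -m tf2onnx.convert` module runner; that avoids the runpy warning from the original log, and an exception raised while parsing the model propagates as a normal Python traceback instead of the process exiting silently. A hedged sketch, assuming the `from_tflite` entry point documented for tf2onnx 1.x (the file names are from the original report):

```python
def convert_tflite(tflite_path, onnx_path, opset=15):
    """Convert a TFLite file to ONNX via tf2onnx's Python API.

    Running the conversion in-process means any Python-level failure
    surfaces as a traceback rather than a silent exit; opset 15
    matches the log in the original report.
    """
    import tf2onnx  # lazy import; requires tf2onnx and TensorFlow

    model_proto, _ = tf2onnx.convert.from_tflite(
        tflite_path, opset=opset, output_path=onnx_path)
    return model_proto
```

Usage would be `convert_tflite("detector.tflite", "detector.onnx")`; if this too dies without a traceback, the crash is almost certainly native, which points back at the TensorFlow TFLite loader.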