tensorflow-onnx
No error and no output model
Describe the bug
I tried to convert one specific TFLite model to ONNX. There was no error, but the converter stopped before generating the output model. I'm not sure what exactly went wrong; other TFLite models converted successfully. Please look at the command and the output below.
C:\models\tensorflow>python -m tf2onnx.convert --tflite detector.tflite --output detector.onnx
C:\Python\Lib\runpy.py:126: RuntimeWarning: 'tf2onnx.convert' found in sys.modules after import of package 'tf2onnx', but prior to execution of 'tf2onnx.convert'; this may result in unpredictable behaviour
warn(RuntimeWarning(msg))
2023-07-14 14:12:52,354 - INFO - Using tensorflow=2.8.0, onnx=1.14.0, tf2onnx=1.14.0/25c977
2023-07-14 14:12:52,354 - INFO - Using opset <onnx, 15>
INFO: Created TensorFlow Lite XNNPACK delegate for CPU.
C:\models\tensorflow>
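For reference, a minimal sketch of running the same conversion through the tf2onnx Python API with debug logging enabled, which may surface a traceback or log line that the CLI run swallows. The file names detector.tflite / detector.onnx and opset 15 are taken from the command and log above; everything else is an assumption about how one might debug this, not a confirmed fix.

import logging

import tf2onnx

# Turn on verbose logging so tf2onnx prints what it is doing before it stops.
logging.basicConfig(level=logging.DEBUG)

# Convert the TFLite model via the Python API instead of the CLI.
# from_tflite returns the ONNX ModelProto plus external tensor storage
# (the latter is only used for large models).
model_proto, _ = tf2onnx.convert.from_tflite(
    "detector.tflite",   # path assumed from the command above
    opset=15,
    output_path="detector.onnx",
)

print("ONNX outputs:", [o.name for o in model_proto.graph.output])

If the process still exits silently with no Python traceback, running it with the interpreter's fault handler enabled (python -X faulthandler -m tf2onnx.convert ...) can at least show where a hard crash happens.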
Urgency
System information
- OS Platform and Distribution (e.g., Linux Ubuntu 18.04): Windows 11 Pro 22H2
- TensorFlow Version: 2.8.0
- Python version: 3.10.4
- ONNX version (if applicable, e.g. 1.11): 1.14.0
- ONNXRuntime version (if applicable, e.g. 1.11): 1.15.1
To Reproduce
Here is a link to the TFLite model I was trying to convert.
Screenshots
Additional context
The error happened while loading the TFLite model with TensorFlow. I will check whether an issue needs to be filed in the TensorFlow repo.
Thank you! There was no message displayed on screen.
I'm having the same issue.
RuntimeWarning: 'tf2onnx.convert' found in sys.modules after import of package 'tf2onnx', but prior to execution of 'tf2onnx.convert'; this may result in unpredictable behaviour
The conversion stops after this line: INFO: Created TensorFlow Lite XNNPACK delegate for CPU.
Having the identical issue.
I am able to load and use the model with interpreter = tf.lite.Interpreter('model.tflite').
I am happy to try and debug the issue, but I'm not sure where to start.
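To help localize the problem, here is a small sketch that checks whether TensorFlow's own TFLite interpreter can load and run the model, since the silent exit happens right after the XNNPACK delegate message. The file name model.tflite follows the comment above; the random-input inference is just an assumption about a reasonable sanity check.

import numpy as np
import tensorflow as tf

# Load the TFLite model with the TF interpreter to rule out a broken file.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
print("inputs:", [(d["name"], d["shape"], d["dtype"]) for d in input_details])
print("outputs:", [(d["name"], d["shape"], d["dtype"]) for d in output_details])

# Run one inference on random data to confirm the model actually executes.
# (Models with dynamic dimensions may need interpreter.resize_tensor_input first.)
for d in input_details:
    dummy = np.random.random_sample(d["shape"]).astype(d["dtype"])
    interpreter.set_tensor(d["index"], dummy)
interpreter.invoke()

for d in output_details:
    print(d["name"], interpreter.get_tensor(d["index"]).shape)

If this loads and runs cleanly but tf2onnx still exits without output, the problem is presumably on the converter's TFLite parsing path rather than in the model file itself, which would be useful information to attach to this issue.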