tensorflow-onnx
The inference results are pretty different after converting to onnx representation
Hi, I encountered a problem when I converted a tf model to onnx format.
- The tf model can be downloaded from: https://tfhub.dev/google/movenet/multipose/lightning/1
- Use tf2onnx to convert it to onnx format: python3.6 -m tf2onnx.convert --saved-model movenet_multipose_lightning_1 --output model.onnx --opset=11 --inputs-as-nchw input, where movenet_multipose_lightning_1 is the directory into which the downloaded gz file was uncompressed.
With the same inference code, the results from the tf model and the converted onnx model are quite different. I am curious what the reason could be.
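One thing worth checking in a setup like this: the --inputs-as-nchw flag makes the converted ONNX model expect NCHW input, while the original TF model takes NHWC, so reusing the exact same inference code (and input array) for both models can itself produce large differences. Below is a minimal numpy-only sketch of transposing the input accordingly and comparing outputs numerically instead of eyeballing them; the shapes and tolerances are illustrative assumptions, not values from the MoveNet model.

```python
import numpy as np

# Hypothetical image batch in TF's NHWC layout (shape is an assumption,
# not the actual MoveNet input signature).
nhwc = np.random.rand(1, 256, 256, 3).astype(np.float32)

# Because the model was converted with --inputs-as-nchw, the ONNX model
# expects NCHW input: transpose the same array before feeding it to
# onnxruntime, rather than reusing the NHWC array unchanged.
nchw = np.transpose(nhwc, (0, 3, 1, 2))
print(nchw.shape)  # (1, 3, 256, 256)

def outputs_close(a, b, rtol=1e-3, atol=1e-5):
    """Compare two model outputs with explicit tolerances; small numeric
    drift between backends is normal, large gaps indicate a real bug."""
    a, b = np.asarray(a), np.asarray(b)
    return a.shape == b.shape and np.allclose(a, b, rtol=rtol, atol=atol)
```

In practice one would feed `nhwc` to the TF saved model and `nchw` to the onnxruntime session, then pass both outputs to `outputs_close`.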
The results are the same for me. Could you please try again with the latest versions of tensorflow, onnxruntime and tf2onnx?
It's been over 3 months, so closing this. Feel free to open a new one if the issue still exists.