tensorflow-onnx

The inference results are pretty different after converting to onnx representation

liamsun2019 opened this issue 3 years ago

Hi, I encountered a problem when I converted a tf model to onnx format.

  1. The tf model can be downloaded from: https://tfhub.dev/google/movenet/multipose/lightning/1

  2. Use tf2onnx to convert to ONNX format: python3.6 -m tf2onnx.convert --saved-model movenet_multipose_lightning_1 --output model.onnx --opset=11 --inputs-as-nchw input, where movenet_multipose_lightning_1 is the directory into which the downloaded gz file was extracted.
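
For reference, a minimal sanity-check sketch (assuming the command above produced model.onnx in the current directory) that validates the converted graph and prints the inputs/outputs the ONNX model actually exposes:

    import onnx
    import onnxruntime as ort

    # Load and structurally validate the converted model (raises on error).
    model = onnx.load("model.onnx")
    onnx.checker.check_model(model)

    # Print the input/output names, shapes, and dtypes of the converted graph.
    sess = ort.InferenceSession("model.onnx")
    for i in sess.get_inputs():
        print("input :", i.name, i.shape, i.type)
    for o in sess.get_outputs():
        print("output:", o.name, o.shape, o.type)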

With the same inference code, the results from the tf model and the converted onnx model are quite different. I am curious what the reason could be.
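
Below is a minimal comparison sketch. It is only an illustration of how the two models could be compared side by side: the 256x256 input size and int32 dtype follow the MoveNet model card, the NCHW transpose follows from the --inputs-as-nchw flag in the conversion command, and the assumption that the first ONNX output corresponds to the SavedModel's single output is not verified here.

    import numpy as np
    import tensorflow as tf
    import onnxruntime as ort

    # Dummy frame in NHWC layout; MoveNet multipose takes int32 input whose
    # height/width are multiples of 32 (256x256 here is an assumption).
    frame = np.random.randint(0, 256, size=(1, 256, 256, 3), dtype=np.int32)

    # TF SavedModel expects NHWC input.
    saved_model = tf.saved_model.load("movenet_multipose_lightning_1")
    infer = saved_model.signatures["serving_default"]
    tf_out = list(infer(tf.constant(frame)).values())[0].numpy()

    # The ONNX model expects NCHW because of --inputs-as-nchw, so transpose.
    sess = ort.InferenceSession("model.onnx")
    input_name = sess.get_inputs()[0].name
    onnx_out = sess.run(None, {input_name: frame.transpose(0, 3, 1, 2)})[0]

    print("max abs diff:", np.max(np.abs(tf_out - onnx_out)))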

liamsun2019 avatar Dec 17 '21 09:12 liamsun2019

The results are the same for me. Could you please try again with the latest versions of tensorflow, onnxruntime, and tf2onnx?
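
For example, the environment can be refreshed along these lines (a generic upgrade, no specific versions pinned), after which the conversion command from the original report can be re-run:

    pip install --upgrade tensorflow tf2onnx onnxruntime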

fatcat-z avatar Mar 18 '22 09:03 fatcat-z

It's been over 3 months, so closing this. Feel free to open a new one if the issue still exists.

fatcat-z avatar Oct 11 '22 05:10 fatcat-z