
[BUG] Different inference results from tflite model and onnx model (converted from tflite model)

Open zhuxiaoxuhit opened this issue 3 years ago • 2 comments

Describe the bug
I converted a tflite model to an onnx model and then ran inference with the onnxruntime Python API. The tflite model and the onnx model (converted from the tflite model) give different inference results.

System information

  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Linux Ubuntu 16.04
  • Tensorflow Version: 1.13.1
  • Python version: 3.6
  • tf2onnx version: 1.9.3
  • onnxruntime version: 1.10.0

To Reproduce
tflite file: https://drive.google.com/file/d/1wYMaZdT3ZzZejJCESLocmOL9s3IH4vDc/view?usp=sharing
tflite inference:

import tensorflow as tf
import numpy as np

np.set_printoptions(threshold=np.inf)
np.set_printoptions(suppress=True)

# Load the TFLite model and create the interpreter
lite_path = 'model_ch_concat.lite'
interpreter = tf.lite.Interpreter(model_path=lite_path)

interpreter.allocate_tensors()
input_detail = interpreter.get_input_details()
output_detail = interpreter.get_output_details()

# Build a 1x60 int32 input; only the first seven token ids are non-zero
input_array = np.zeros(60, dtype=np.int32).reshape(1, 60)
input_array[0][0] = 2448
input_array[0][1] = 5037
input_array[0][2] = 5037
input_array[0][3] = 4082
input_array[0][4] = 6384
input_array[0][5] = 3419
input_array[0][6] = 5871

# Run inference and read back the first output tensor
interpreter.set_tensor(input_detail[0]['index'], input_array)
interpreter.invoke()
output_data = interpreter.get_tensor(output_detail[0]['index'])

print(output_data)
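
Before comparing backends, it can help to confirm the tensor metadata the interpreter actually expects; a minimal sketch, reusing the interpreter object from the script above:

# Sketch: print the shapes and dtypes the TFLite interpreter expects,
# so the feed above can be checked against them.
for detail in interpreter.get_input_details():
    print("input: ", detail['name'], detail['shape'], detail['dtype'])
for detail in interpreter.get_output_details():
    print("output:", detail['name'], detail['shape'], detail['dtype'])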

convert tflite to onnx:

python -m tf2onnx.convert --tflite model_ch_concat.lite --opset 15 --output model_ch_concat_20220110.onnx
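
For reference, the same conversion can also be driven from Python through tf2onnx's from_tflite helper (available in the 1.9.x releases); a minimal sketch, assuming the file names from the report above:

import tf2onnx

# Sketch: convert the TFLite model with the tf2onnx Python API instead of the CLI.
# from_tflite returns the ONNX ModelProto and an external-tensor storage object (unused here).
model_proto, _ = tf2onnx.convert.from_tflite(
    "model_ch_concat.lite",
    opset=15,
    output_path="model_ch_concat_20220110.onnx",
)
print([o.name for o in model_proto.graph.output])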

onnx file from tflite: https://drive.google.com/file/d/1JLTXz_-Zdgy_5mgEjawxvw93oIBsWxhf/view?usp=sharing
onnxruntime inference:

import numpy as np
import onnxruntime as ort

np.set_printoptions(threshold=np.inf)

# Create the session for the converted model
options = ort.SessionOptions()
sess_ort = ort.InferenceSession("model_ch_concat_20220110.onnx", sess_options=options)
in0 = sess_ort.get_inputs()[0].name
out0 = sess_ort.get_outputs()[0].name

# Same 1x60 int32 input as in the tflite script
input_array = np.zeros(60, dtype=np.int32).reshape(1, 60)
input_array[0][0] = 2448
input_array[0][1] = 5037
input_array[0][2] = 5037
input_array[0][3] = 4082
input_array[0][4] = 6384
input_array[0][5] = 3419
input_array[0][6] = 5871

res = sess_ort.run(None, input_feed={in0: input_array})
print(res[0])
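
To quantify the mismatch instead of eyeballing screenshots, the two outputs can be compared numerically; a minimal sketch, assuming the tflite result was saved to tflite_out.npy (a hypothetical file name) with np.save at the end of the first script:

import numpy as np

# Sketch: compare the ONNX Runtime output against the saved TFLite output.
tflite_out = np.load("tflite_out.npy")   # hypothetical dump from the tflite script
onnx_out = res[0]                        # res from the onnxruntime script above

print("max abs diff:", np.max(np.abs(onnx_out.astype(np.float64) - tflite_out.astype(np.float64))))
print("allclose (atol=1e-4):", np.allclose(onnx_out, tflite_out, atol=1e-4))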

Screenshots
tflite inference result: (screenshot)
onnxruntime inference result: (screenshot)

zhuxiaoxuhit avatar Jan 10 '22 12:01 zhuxiaoxuhit

I also encountered this problem, did you solve it?

xxoospring avatar Jun 13 '22 02:06 xxoospring

> I also encountered this problem, did you solve it?

No, I didn't. In the end I just used the tflite model directly, and it works well.

zhuxiaoxuhit avatar Jun 13 '22 02:06 zhuxiaoxuhit