
How to convert tflite model to onnx model?

Open xiaoxiongli opened this issue 3 years ago • 3 comments

Great work, congratulations!

I have two questions:

  1. Which framework do you use: TensorFlow or TensorFlow Lite?
  2. If you use TensorFlow Lite, how do you convert the output .tflite model to an ONNX model?

Please help... thank you very much! Big thumbs up ^_^

xiaoxiongli avatar May 31 '21 12:05 xiaoxiongli

Hello:)

  1. We use TensorFlow during training, and after training we convert the SavedModel to a .tflite model.
  2. I just uploaded the .pb file to the experiment directory, so you can convert it to an ONNX model.
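For reference, the TensorFlow-side step (SavedModel → .tflite) can be sketched as below. The tiny Keras model here is only a stand-in for the repo's actual SR network, and the file paths are hypothetical, not the repo's layout:

```python
import tensorflow as tf

# Stand-in model; in the repo this would be the trained network
# restored from the experiment checkpoint (path hypothetical).
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(4, 3, padding="same", input_shape=(64, 64, 3)),
])

# Save in SavedModel format (produces the saved_model.pb directory layout).
model.save("saved_model_dir")

# Convert the SavedModel to a TFLite flatbuffer.
converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")
tflite_bytes = converter.convert()
with open("model.tflite", "wb") as f:
    f.write(tflite_bytes)
```

The resulting `saved_model_dir` is also what `tf2onnx` consumes for the ONNX conversion, so the .tflite step and the .onnx step both start from the same SavedModel.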

NJU-Jet avatar May 31 '21 12:05 NJU-Jet

python -m tf2onnx.convert --saved-model experiment/base7_D4C28_bs16ps64_lr1e-3/best_status/ --output model.onnx --opset 11 --verbose --inputs-as-nchw serving_default_input_1:0

error: (screenshot attached in the original issue; error text not available)

simplew2011 avatar Aug 08 '23 06:08 simplew2011

Changing to TensorFlow 2.3.1 fixes the error above.

simplew2011 avatar Aug 08 '23 09:08 simplew2011