SR_Mobile_Quantization
How to convert a .tflite model to an ONNX model?
Great work, congratulations!
I have two questions:
- Which framework do you use: TensorFlow or TensorFlow Lite?
- If you use TensorFlow Lite, how can the output .tflite model be converted to an ONNX model?
Please help. Thank you very much! ^_^
Hello :)
- We use TensorFlow during training, and after training we convert the saved model to a .tflite model.
- I just uploaded the .pb file to the experiment directory, so you can convert it to an ONNX model:
python -m tf2onnx.convert --saved-model experiment/base7_D4C28_bs16ps64_lr1e-3/best_status/ --output model.onnx --opset 11 --verbose --inputs-as-nchw serving_default_input_1:0
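For reference, the saved-model-to-.tflite step mentioned above can be sketched roughly as follows. This is a minimal illustration, not the repo's actual export script; the tiny demo model and paths are stand-ins for the real model under experiment/.../best_status/, and the repo's full-integer quantization would additionally need `converter.optimizations` and a representative dataset.

```python
import os
import tempfile

import tensorflow as tf


def saved_model_to_tflite(saved_model_dir: str, out_path: str) -> int:
    """Convert a TensorFlow SavedModel to a .tflite flatbuffer.

    Returns the size of the written model in bytes. Quantized export
    (as used in this repo) would also set converter.optimizations and
    provide a representative dataset.
    """
    converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
    tflite_model = converter.convert()
    with open(out_path, "wb") as f:
        f.write(tflite_model)
    return len(tflite_model)


if __name__ == "__main__":
    # Tiny stand-in model; the real one is the trained SR network.
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(64, 64, 3)),
        tf.keras.layers.Conv2D(4, 3, padding="same"),
    ])
    with tempfile.TemporaryDirectory() as tmp:
        sm_dir = os.path.join(tmp, "saved_model")
        tf.saved_model.save(model, sm_dir)  # export as SavedModel
        size = saved_model_to_tflite(sm_dir, os.path.join(tmp, "model.tflite"))
        print("tflite bytes:", size)
```

The resulting SavedModel directory is also what the tf2onnx command above consumes, so the same export serves both the .tflite and the ONNX paths.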
I ran the same command and got an error:
Changing to TensorFlow 2.3.1 fixed it.