SR_Mobile_Quantization
I want to train the model and then I need an int8 ONNX model; what should I do, step by step?
Dear @NJU-Jet:
I'm confused: if I use the command "python train.py --opt options/train/base7.yaml --name base7_D4C28_bs16ps64_lr1e-3 --scale 3 --bs 16 --ps 64 --lr 1e-3 --gpu_ids 0" to train a model, is the resulting model a float32 model or an int8 model?
I want to train the model and then get an int8 ONNX model; what should I do, step by step? Do I need to run or modify generate_tflite.py to get an int8 model and then convert it to an int8 ONNX model, or is the .pb model produced by the training command above enough to convert to an int8 ONNX model?
Thank you very much!
Hi @xiaoxiongli: You will get a float32 model (a .pb file) by running this command, and then you can convert it to an ONNX model (see the sketch below). Please refer to https://github.com/onnx/tensorflow-onnx#getting-started.
As far as I know, there isn't a direct way to convert a .tflite model to an ONNX model. :)
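For reference, here is a minimal sketch (not a script from this repo) of that .pb-to-ONNX conversion using the tf2onnx Python API; the file paths and the "input:0"/"output:0" tensor names below are placeholders, so substitute the names from your own frozen graph (the CLI documented in the tf2onnx README, `python -m tf2onnx.convert`, does the same job).

```python
# Sketch: convert the trained float32 frozen graph (.pb) to a float32 ONNX model.
# Assumes TensorFlow 2.x and tf2onnx are installed; paths and tensor names are placeholders.
import tensorflow as tf
from tf2onnx import convert

GRAPHDEF_PATH = "experiment/base7_D4C28_bs16ps64_lr1e-3/model.pb"  # hypothetical path
ONNX_PATH = "base7_D4C28_fp32.onnx"

# Load the frozen GraphDef produced after training.
graph_def = tf.compat.v1.GraphDef()
with tf.io.gfile.GFile(GRAPHDEF_PATH, "rb") as f:
    graph_def.ParseFromString(f.read())

# Convert to ONNX; input/output names must match the frozen graph.
model_proto, _ = convert.from_graph_def(
    graph_def,
    input_names=["input:0"],    # placeholder tensor name
    output_names=["output:0"],  # placeholder tensor name
    opset=13,
    output_path=ONNX_PATH,
)
print("Saved float32 ONNX model to", ONNX_PATH)
```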
Dear @NJU-Jet:
My goal is to run the int8 model in OpenVINO, MNN, or TNN. Of course, I can use https://github.com/onnx/tensorflow-onnx#getting-started, which you mentioned before, to convert the .pb model to an ONNX model, but that conversion only gives a float32 ONNX model. How can I use an int8 model in OpenVINO, MNN, or TNN?
Can I convert the float32 ONNX model to an int8 model using OpenVINO/MNN/TNN, and does doing so affect the PSNR?
Please help... sorry for disturbing you again, and thank you very much~
Maybe this works: https://github.com/jackwish/tflite2onnx. Don't worry, it's my pleasure to help you.
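If you try it, the usage is essentially a one-liner. Here is a sketch with placeholder paths; whether it handles a given quantized (int8/fp16) model depends on the operator coverage of the tflite2onnx version you install:

```python
# Sketch: convert a .tflite model to ONNX with tflite2onnx
# (https://github.com/jackwish/tflite2onnx). Paths are placeholders.
import tflite2onnx

TFLITE_PATH = "base7_D4C28_int8.tflite"  # hypothetical path
ONNX_PATH = "base7_D4C28_int8.onnx"

tflite2onnx.convert(TFLITE_PATH, ONNX_PATH)
print("Saved ONNX model to", ONNX_PATH)
```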
In my understanding, you can convert the float32 ONNX model to an int8 model using OpenVINO/MNN/TNN, and the PSNR should drop by less than 0.1 dB, because the quantization algorithm is the same as TFLite's.
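Each runtime ships its own post-training quantization tool (OpenVINO's POT, MNN's quantizer, TNN's converter). As one concrete illustration of the same idea, here is a hedged sketch using ONNX Runtime's static quantization instead: feed a small set of low-resolution calibration patches through the float32 ONNX model and quantize weights and activations to int8. The input name "input", the patch format, and the file paths are assumptions, not values from this repo.

```python
# Illustrative sketch: post-training int8 quantization of a float32 ONNX model
# with ONNX Runtime (one option besides OpenVINO/MNN/TNN). Input name, patch
# format, and paths are assumptions.
import glob

import numpy as np
from onnxruntime.quantization import CalibrationDataReader, QuantType, quantize_static


class LRPatchReader(CalibrationDataReader):
    """Feeds a few preprocessed low-resolution patches as calibration data."""

    def __init__(self, patch_dir, input_name="input"):
        # Assumes patches saved as .npy arrays shaped like the model input, e.g. (1, H, W, 3).
        self.files = iter(sorted(glob.glob(f"{patch_dir}/*.npy")))
        self.input_name = input_name

    def get_next(self):
        path = next(self.files, None)
        if path is None:
            return None  # signals that the calibration data is exhausted
        return {self.input_name: np.load(path).astype(np.float32)}


quantize_static(
    model_input="base7_D4C28_fp32.onnx",   # float32 model exported via tf2onnx
    model_output="base7_D4C28_int8.onnx",  # quantized int8 model
    calibration_data_reader=LRPatchReader("calib_patches"),
    weight_type=QuantType.QInt8,
    activation_type=QuantType.QUInt8,
)
print("Saved int8 ONNX model to base7_D4C28_int8.onnx")
```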
It's clear, I got it! It's very kind of you ^_^
Hi @xiaoxiongli, I've hit the same point as you: converting an INT8 or FP16 TFLite model to ONNX format. Do you have any good advice? Thank you~