Facing a version incompatibility issue between the protobuf library and the MNN library while running the mnnquant quantization process through the MNN Python API.
python3 mnnquant.py /home/kpit/Incabin_sensing_fy_24_25/Nithin/qnx/src/inferenceAppNew/inceptionV3_fp_32.mnn /home/kpit/Incabin_sensing_fy_24_25/Nithin/qnx/src/inferenceAppOll/test.mnn /home/kpit/Incabin_sensing_fy_24_25/Nithin/qnx/src/imageInputConfig.json
Traceback (most recent call last):
File "/home/kpit/Incabin_sensing_fy_24_25/Nithin/Alibaba_MNN/MNN/pymnn/pip_package/MNN/tools/mnnquant.py", line 8, in
There is no problem with the protobuf version, since the installed version is libprotoc 3.20.3.
It seems the libc++ versions are not the same.
The above issue is resolved. I tried using the pip version of MNN and running mnnquant.py, and it no longer throws the import error. However, it now fails with an Aborted (core dumped) error in the middle of quantization.
mnnquant test.mnn testquant.mnn /home/kpit/Incabin_sensing_fy_24_25/Nithin/qnx/src/imageInputConfig.json
The device support i8sdot:0, support fp16:0, support i8mm: 0
The device support i8sdot:0, support fp16:0, support i8mm: 0
Aborted (core dumped)
You can try running mnnquant under gdb to debug the crash stack. The crash may be caused by an invalid picture in the image path; see the sketch below for a quick check. At the same time, you can use mnnconvert with --weightQuantBits=8 to quantize only the weights, and then enable MNN_LOW_MEMORY to use dynamic quantization.
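If the crash really is triggered by a bad calibration image, a quick pre-check can confirm it before reaching for gdb. The snippet below is a minimal sketch, not part of MNN: it assumes the calibration image directory is stored under a "path" key in imageInputConfig.json and uses Pillow (a separate dependency) to try decoding each file.

```python
# Sketch only: scan the calibration image folder referenced by the quantization
# config and report files that cannot be decoded. Assumes the config stores the
# image directory under a "path" key and that Pillow is installed; adjust both
# to match your setup.
import json
import os
import sys

from PIL import Image

config_file = sys.argv[1] if len(sys.argv) > 1 else "imageInputConfig.json"
with open(config_file) as f:
    image_dir = json.load(f).get("path", ".")

bad = []
for name in sorted(os.listdir(image_dir)):
    full_path = os.path.join(image_dir, name)
    if not os.path.isfile(full_path):
        continue
    try:
        with Image.open(full_path) as img:
            img.verify()  # raises if the file is truncated or not an image
    except Exception as exc:
        bad.append((full_path, exc))

for full_path, exc in bad:
    print(f"unreadable image: {full_path} ({exc})")
print(f"{len(bad)} problem file(s) found in {image_dir}")
```

If every file decodes cleanly, the backtrace from running mnnquant under gdb (for example with `gdb --args mnnquant ...`) is the next place to look.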