
TFLite model conversion fails: Check failed: customOPCode == "TFLite_Detection_PostProcess" ==> Now Only support Custom op of 'TFLite_Detection_PostProcess'

Open lszdbz opened this issue 2 years ago • 3 comments

Platform (include target platform as well if cross-compiling):

Linux

GitHub version:

commit 728796b21c021a121059f5d6f9730f821fbd0225

Compiling method:

cmake

cmake .. -DMNN_BUILD_CONVERTER=true
make -j4


Build log:

Conversion command:
./MNNConvert -f TFLITE --modelFile model_en.tflite  --MNNModel model-en.mnn --bizCode biz

The device support i8sdot:0, support fp16:0, support i8mm: 0
Start to Convert Other Model Format To MNN Model...
[16:35:08] /media/tongli/lt/MNN-github/MNN-new/tools/converter/source/tflite/CustomTflite.cpp:29: Check failed: customOPCode == "TFLite_Detection_PostProcess" ==> Now Only support Custom op of 'TFLite_Detection_PostProcess'
[16:35:08] /media/tongli/lt/MNN-github/MNN-new/tools/converter/source/tflite/CustomTflite.cpp:61: Check failed: tfliteOp->inputs.size() == 3 ==> TFLite_Detection_PostProcess should have 3 inputs!
[16:35:08] /media/tongli/lt/MNN-github/MNN-new/tools/converter/source/tflite/CustomTflite.cpp:62: Check failed: tfliteOp->outputs.size() == 4 ==> TFLite_Detection_PostProcess should have 4 outputs!
[16:35:08] /media/tongli/lt/MNN-github/MNN-new/tools/converter/source/tflite/CustomTflite.cpp:29: Check failed: customOPCode == "TFLite_Detection_PostProcess" ==> Now Only support Custom op of 'TFLite_Detection_PostProcess'
[16:35:08] /media/tongli/lt/MNN-github/MNN-new/tools/converter/source/tflite/CustomTflite.cpp:61: Check failed: tfliteOp->inputs.size() == 3 ==> TFLite_Detection_PostProcess should have 3 inputs!
[16:35:08] /media/tongli/lt/MNN-github/MNN-new/tools/converter/source/tflite/CustomTflite.cpp:62: Check failed: tfliteOp->outputs.size() == 4 ==> TFLite_Detection_PostProcess should have 4 outputs!
[16:35:08] /media/tongli/lt/MNN-github/MNN-new/tools/converter/source/tflite/CustomTflite.cpp:29: Check failed: customOPCode == "TFLite_Detection_PostProcess" ==> Now Only support Custom op of 'TFLite_Detection_PostProcess'
[16:35:08] /media/tongli/lt/MNN-github/MNN-new/tools/converter/source/tflite/CustomTflite.cpp:61: Check failed: tfliteOp->inputs.size() == 3 ==> TFLite_Detection_PostProcess should have 3 inputs!
[16:35:08] /media/tongli/lt/MNN-github/MNN-new/tools/converter/source/tflite/CustomTflite.cpp:62: Check failed: tfliteOp->outputs.size() == 4 ==> TFLite_Detection_PostProcess should have 4 outputs!
[16:35:08] /media/tongli/lt/MNN-github/MNN-new/tools/converter/source/tflite/CustomTflite.cpp:29: Check failed: customOPCode == "TFLite_Detection_PostProcess" ==> Now Only support Custom op of 'TFLite_Detection_PostProcess'
[16:35:08] /media/tongli/lt/MNN-github/MNN-new/tools/converter/source/tflite/CustomTflite.cpp:61: Check failed: tfliteOp->inputs.size() == 3 ==> TFLite_Detection_PostProcess should have 3 inputs!
[16:35:08] /media/tongli/lt/MNN-github/MNN-new/tools/converter/source/tflite/CustomTflite.cpp:62: Check failed: tfliteOp->outputs.size() == 4 ==> TFLite_Detection_PostProcess should have 4 outputs!
Start to Optimize the MNN Net...

After this it just stays stuck here.
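
For reference, one way to find out which custom op the TFLite file actually contains (MNNConvert only accepts TFLite_Detection_PostProcess as a custom op) is to dump the operator list with TensorFlow's TFLite analyzer. A minimal sketch, assuming TensorFlow >= 2.7 is installed and reusing the file name from the command above:

import tensorflow as tf

# Prints the model structure, including every builtin and custom operator,
# so the unsupported custom op reported by MNNConvert can be identified.
tf.lite.experimental.Analyzer.analyze(model_path="model_en.tflite")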

lszdbz avatar Jun 17 '23 01:06 lszdbz

There is an unsupported op. Try converting the model to ONNX first, then converting the ONNX model to MNN.
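
For example, a possible route (assuming the tf2onnx converter is installed; file names follow the command in the original report):

python -m tf2onnx.convert --tflite model_en.tflite --output model_en.onnx --opset 13
./MNNConvert -f ONNX --modelFile model_en.onnx --MNNModel model-en.mnn --bizCode biz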

jxt1234 avatar Jun 21 '23 02:06 jxt1234

I have already converted it to an ONNX model. ONNX to MNN conversion works, but the result doesn't seem quite right. Verifying correctness with testMNNFromOnnx.py gives:

python ../tools/script/testMNNFromOnnx.py model/model_en_696856.onnx

Dir exist onnx/test.onnx tensor(int32) tensor(int32) tensor(int32) ['tf.nn.softmax'] inputs: input_ids onnx/ input_mask onnx/ segment_ids onnx/
2023-06-26 11:44:24.691194640 [E:onnxruntime:, sequential_executor.cc:494 ExecuteKernel] Non-zero status code returned while running Gather node. Name:'model/bert_model/embedding_postprocessor/Gather' Status Message: indices element out of data bounds, idx=4 must be within the inclusive range [-2,1]
Traceback (most recent call last):
  File "../tools/script/testMNNFromOnnx.py", line 283, in <module>
    t.Test()
  File "../tools/script/testMNNFromOnnx.py", line 259, in Test
    self.__run_onnx()
  File "../tools/script/testMNNFromOnnx.py", line 168, in __run_onnx
    outputs = ort_session.run(None, inputs)
  File "/home/tongli/anaconda3/envs/tf2-onnx/lib/python3.7/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 200, in run
    return self._sess.run(output_names, input_feed, run_options)
onnxruntime.capi.onnxruntime_pybind11_state.InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Non-zero status code returned while running Gather node. Name:'model/bert_model/embedding_postprocessor/Gather' Status Message: indices element out of data bounds, idx=4 must be within the inclusive range [-2,1]

Can this be resolved? @jxt1234
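
For context, the range [-2,1] in the Gather error suggests the gathered table has only two rows, which looks like the token-type (segment) embedding of a BERT-style model, and testMNNFromOnnx.py appears to feed randomly generated inputs, so idx=4 likely comes from the random test data rather than from a broken conversion. A minimal sketch of running the exported ONNX model with in-range inputs (input names taken from the log above; batch size, sequence length, and vocabulary bound are assumptions):

import numpy as np
import onnxruntime as ort

# Run the exported ONNX model with inputs whose values stay inside the
# valid index ranges instead of unconstrained random integers.
sess = ort.InferenceSession("model/model_en_696856.onnx")
seq_len = 128  # assumed sequence length; adjust to the real model
feeds = {
    "input_ids": np.random.randint(0, 1000, size=(1, seq_len), dtype=np.int32),  # ids inside the vocab
    "input_mask": np.ones((1, seq_len), dtype=np.int32),                          # 0/1 attention mask
    "segment_ids": np.zeros((1, seq_len), dtype=np.int32),                        # only 0 or 1 is valid here
}
outputs = sess.run(None, feeds)
print([o.shape for o in outputs])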

lszdbz avatar Jun 26 '23 03:06 lszdbz

I am facing the same issue with one of the blocks of an LLM model that I am trying to convert.

Nick-infinity avatar Jan 16 '24 16:01 Nick-infinity

Marking as stale. No activity in 60 days.

github-actions[bot] avatar Mar 17 '24 09:03 github-actions[bot]