
ONNX simplification crash (in export_onnx.py)

Open sahamitul opened this issue 2 years ago • 8 comments

python export_onnx.py --height 640 --width 640

gives this crash during the simplification stage:

... simplifying with onnx-simplifier 0.3.7...
Traceback (most recent call last):
  File "export_onnx.py", line 178, in <module>
    model_onnx, check = onnxsim.simplify(model_onnx, check_n=3)
  File "/home/msaha/.local/lib/python3.6/site-packages/onnxsim/onnx_simplifier.py", line 492, in simplify
    model = fixed_point(model, infer_shapes_and_optimize, constant_folding)
  File "/home/msaha/.local/lib/python3.6/site-packages/onnxsim/onnx_simplifier.py", line 393, in fixed_point
    x = func_b(x)
  File "/home/msaha/.local/lib/python3.6/site-packages/onnxsim/onnx_simplifier.py", line 486, in constant_folding
    custom_lib=custom_lib)
  File "/home/msaha/.local/lib/python3.6/site-packages/onnxsim/onnx_simplifier.py", line 232, in forward_for_node_outputs
    custom_lib=custom_lib)
  File "/home/msaha/.local/lib/python3.6/site-packages/onnxsim/onnx_simplifier.py", line 198, in forward
    ), sess_options=sess_options, providers=['CPUExecutionProvider'])
  File "/home/msaha/.local/lib/python3.6/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 335, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/home/msaha/.local/lib/python3.6/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 370, in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node (Mul_948) Op (Mul) [ShapeInferenceError] Incompatible dimensions

Any ideas? Thanks!

sahamitul avatar Mar 28 '22 20:03 sahamitul
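Per the traceback, the failure happens inside onnxsim.simplify (called around line 178 of export_onnx.py), not in the export itself, so one stop-gap is to make the simplification step optional and keep the raw export when it crashes. A minimal sketch, assuming a hypothetical output path yolop-640-640.onnx rather than the exact variable names used in export_onnx.py:

```python
# Workaround sketch (not from the repo): keep the unsimplified export if
# onnx-simplifier crashes. "yolop-640-640.onnx" is an assumed file name;
# export_onnx.py may write to a different path.
import onnx
import onnxsim

onnx_path = "yolop-640-640.onnx"          # assumed output of torch.onnx.export
model_onnx = onnx.load(onnx_path)

try:
    # same call that fails at line 178 of export_onnx.py
    model_simp, check = onnxsim.simplify(model_onnx, check_n=3)
    assert check, "simplified ONNX model could not be validated"
    model_onnx = model_simp
except Exception as e:
    print(f"onnx-simplifier failed ({e}); keeping the unsimplified model")

onnx.save(model_onnx, onnx_path)
```

This does not fix the Mul_948 shape mismatch, but it leaves a loadable (unsimplified) ONNX file while the version mismatch discussed below is sorted out.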


Have you solved this yet? I'm running into the same problem.

Yaoxingtian avatar Apr 01 '22 06:04 Yaoxingtian

I set up a fresh environment following the official configuration, and it worked with no issues.

Yaoxingtian avatar Apr 01 '22 07:04 Yaoxingtian

Running export_onnx.py, I hit this error:

2022-04-09 18:10:09.4948181 [E:onnxruntime:, sequential_executor.cc:346 onnxruntime::SequentialExecutor::Execute] Non-zero status code returned while running Mul node. Name:'Mul_1340' Status Message: D:\a_work\1\s\onnxruntime\core/providers/cpu/math/element_wise_ops.h:505 onnxruntime::BroadcastIterator::Append axis == 1 || axis == largest was false. Attempting to broadcast an axis by a dimension other than 1. 3 by 20

Traceback (most recent call last):
  File "D:/YOLOP-main/export_onnx.py", line 178, in <module>
    model_onnx, check = onnxsim.simplify(model_onnx, check_n=3)
  File "C:\Users\86135\AppData\Roaming\Python\Python36\site-packages\onnxsim\onnx_simplifier.py", line 492, in simplify
    model = fixed_point(model, infer_shapes_and_optimize, constant_folding)
  File "C:\Users\86135\AppData\Roaming\Python\Python36\site-packages\onnxsim\onnx_simplifier.py", line 393, in fixed_point
    x = func_b(x)
  File "C:\Users\86135\AppData\Roaming\Python\Python36\site-packages\onnxsim\onnx_simplifier.py", line 486, in constant_folding
    custom_lib=custom_lib)
  File "C:\Users\86135\AppData\Roaming\Python\Python36\site-packages\onnxsim\onnx_simplifier.py", line 232, in forward_for_node_outputs
    custom_lib=custom_lib)
  File "C:\Users\86135\AppData\Roaming\Python\Python36\site-packages\onnxsim\onnx_simplifier.py", line 216, in forward
    outputs, inputs, run_options=run_options)))
  File "D:\anaconda3\envs\pytorch\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 192, in run
    return self._sess.run(output_names, input_feed, run_options)
onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Non-zero status code returned while running Mul node. Name:'Mul_1340' Status Message: D:\a_work\1\s\onnxruntime\core/providers/cpu/math/element_wise_ops.h:505 onnxruntime::BroadcastIterator::Append axis == 1 || axis == largest was false. Attempting to broadcast an axis by a dimension other than 1. 3 by 20

I don't know how to solve this problem, help!!!

lzm2275965881 avatar Apr 09 '22 10:04 lzm2275965881
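The RUNTIME_EXCEPTION above is raised while onnx-simplifier feeds the graph through onnxruntime during constant folding, so a useful first check is whether the raw, unsimplified export already fails on its own. A small sketch, again assuming the hypothetical file name yolop-640-640.onnx and the 640x640 input size from the export command at the top of this issue:

```python
# Sanity-check sketch (not part of the repo): run the unsimplified export
# directly through onnxruntime on CPU.
import numpy as np
import onnxruntime as ort

onnx_path = "yolop-640-640.onnx"  # assumed output path of export_onnx.py
sess = ort.InferenceSession(onnx_path, providers=["CPUExecutionProvider"])

inp = sess.get_inputs()[0]
print("input:", inp.name, inp.shape)

# 1x3x640x640 matches the --height 640 --width 640 export command above
dummy = np.random.rand(1, 3, 640, 640).astype(np.float32)
outputs = sess.run(None, {inp.name: dummy})
for i, out in enumerate(outputs):
    print(f"output[{i}] shape = {out.shape}")
```

If this direct run reproduces the Mul broadcast error, the exported graph itself is the problem; if it runs cleanly, the failure is specific to the simplification pass.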

sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node (Mul_558) Op (Mul) [ShapeInferenceError] Incompatible dimensions

I ran into exactly this kind of error while running export_onnx.py to export the ONNX model, in this environment:

  • Ubuntu 20.04
  • python 3.8.12
  • onnx 1.10.2
  • onnx-simplifier 0.3.7
  • onnxruntime 1.11.0
  • pytorch 1.10.2+cu113
  • torchvision 0.11.3

Then I tried exporting in another environment, and it magically worked:

  • Windows10
  • python 3.7.6
  • onnx 1.9.0
  • onnx-simplifier 0.3.6
  • onnxruntime 1.7.0
  • pytorch 1.7.1+cu102
  • torchvision 0.8.2

Hope this helps. You could try running export_onnx.py in a different environment; it might work.

niuniandajiba avatar Apr 20 '22 07:04 niuniandajiba
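Since the failing and working setups above differ only in package versions, it can help to compare the local environment against the combination @niuniandajiba reported as working. A quick sketch (the "known good" pins are copied from the comment above; importlib.metadata needs Python 3.8+, so on older interpreters use pkg_resources instead):

```python
# Print installed versions and compare them against the combination reported
# to work in this thread. This is an editor's sketch, not part of the repo.
from importlib.metadata import version, PackageNotFoundError

known_good = {
    "onnx": "1.9.0",
    "onnx-simplifier": "0.3.6",
    "onnxruntime": "1.7.0",
    "torch": "1.7.1",
    "torchvision": "0.8.2",
}

for pkg, wanted in known_good.items():
    try:
        installed = version(pkg)
    except PackageNotFoundError:
        installed = "not installed"
    flag = "" if installed.startswith(wanted) else "  <-- differs"
    print(f"{pkg:16s} installed={installed:14s} reported-working={wanted}{flag}")
```

Anything flagged as differing, especially onnx, onnx-simplifier, and onnxruntime, is a candidate for downgrading before re-running export_onnx.py.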

Thank you @niuniandajiba, @Yaoxingtian! Let me try...

sahamitul avatar Apr 21 '22 03:04 sahamitul

Can the repo owners clarify which versions they used for the OS, onnx, onnx-simplifier, onnxruntime, and anything else that is needed but not clear from requirements.txt? @Riser6

sahamitul avatar Apr 21 '22 17:04 sahamitul

As suggested by @niuniandajiba, I tried:

  • onnx 1.9.0
  • onnx-simplifier 0.3.6
  • onnxruntime 1.7.0
  • pytorch 1.7.1+cu102
  • torchvision 0.8.2

with Ubuntu 18.04.6 LTS and Python 3.6.9 (Python 3.7.x should be fine too), and the export now seems to work.

sahamitul avatar Apr 22 '22 23:04 sahamitul