
onnx export can work

Open · zackxue opened this issue 4 years ago · 1 comment

python ./models/export.py --weights ./weights/helmet_head_person_s.pt --img 640 --batch 1

File "demo_onnx.py", line 301, in detections = detect_onnx(official=False, image_path=image_path) File "demo_onnx.py", line 180, in detect_onnx session = onnxruntime.InferenceSession('./weights/helmet_head_person_s.onnx') File "C:\Users\vision\Anaconda3\envs\pt_gpu\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 283, in init self._create_inference_session(providers, provider_options, disabled_optimizers) File "C:\Users\vision\Anaconda3\envs\pt_gpu\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 310, in _create_inference_session sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model) onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from ./weights/helmet_head_person_s.onnx failed:Node (Mul_882) Op (Mul) [ShapeInferenceError] Incompatible dimensions

When I use yolov5s.pt it works: python ./models/export.py --weights ./weights/yolov5s.pt --img 640 --batch 1, tested with the attached demo_onnx.zip below.

When using the official model, set official to True: detections = detect_onnx(official=True, image_path=image_path)

zackxue · Jul 05 '21 03:07
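For anyone debugging this, here is a minimal sketch (not from the original report) that checks the exported file before creating a session; it assumes the onnx and onnxruntime packages are installed and reuses the model path and the 640x640, batch-1 input size from the export command above. The CPU provider is just a convenient default.

    import numpy as np
    import onnx
    import onnxruntime

    model_path = './weights/helmet_head_person_s.onnx'  # path from the export command above

    # Full structural check, including shape inference; this may surface the same
    # incompatible Mul dimensions that onnxruntime reports at session creation.
    onnx.checker.check_model(onnx.load(model_path), full_check=True)

    # If the check passes, try a dummy inference at the export resolution.
    session = onnxruntime.InferenceSession(model_path, providers=['CPUExecutionProvider'])
    input_name = session.get_inputs()[0].name
    dummy = np.random.rand(1, 3, 640, 640).astype(np.float32)
    outputs = session.run(None, {input_name: dummy})
    print([o.shape for o in outputs])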

Sorry, the title should be "onnx export can't work".

zackxue · Jul 05 '21 03:07

I ran into the same problem; using the official yolov5 export.py solved it.

FunJoo · Nov 23 '22 07:11
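For reference, recent versions of the official ultralytics/yolov5 repo export ONNX through export.py at the repo root with an --include onnx flag rather than models/export.py; exact flag names vary between versions, so check python export.py --help. Something along the lines of: python export.py --weights ./weights/helmet_head_person_s.pt --include onnx --img 640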