How to export to ONNX?

Open xinsuinizhuan opened this issue 1 year ago • 15 comments

xinsuinizhuan avatar Jul 28 '22 13:07 xinsuinizhuan

I have successfully converted the model to ONNX:

torch.onnx.export(model, img, './test.onnx', verbose=True, opset_version=opset_version, input_names=input_names, output_names=output_names, dynamic_axes=dynamic_axes)

ykk648 avatar Jul 29 '22 08:07 ykk648

> I have successfully converted the model to ONNX: torch.onnx.export(model, img, './test.onnx', verbose=True, opset_version=opset_version, input_names=input_names, output_names=output_names, dynamic_axes=dynamic_axes)

When I export the .pt to ONNX I get the error below. Can you tell me how you converted the model successfully?

RuntimeError: Exporting the operator silu to ONNX opset version 11 is not supported. Please open a bug to request ONNX export support for the missing operator.

xddlj avatar Aug 24 '22 02:08 xddlj

@xddlj try opset_version = 13

ykk648 avatar Aug 25 '22 01:08 ykk648
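For context: the silu failure comes from older PyTorch exporters lacking a symbolic for aten::silu at low opsets, so raising the opset often resolves it. If you are stuck on an older opset, a common workaround borrowed from yolov5's export path (not something this repo ships) is to replace nn.SiLU with an explicit x * sigmoid(x) module before exporting. A minimal sketch, assuming the model is an ordinary nn.Module tree:

import torch
import torch.nn as nn

class SiLUExport(nn.Module):
    # SiLU written as x * sigmoid(x), which lowers to Mul + Sigmoid
    # and exports cleanly even on older opsets.
    def forward(self, x):
        return x * torch.sigmoid(x)

def patch_silu(module):
    # Recursively swap every nn.SiLU for the decomposed version in place.
    for name, child in module.named_children():
        if isinstance(child, nn.SiLU):
            setattr(module, name, SiLUExport())
        else:
            patch_silu(child)
    return module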

> I have successfully converted the model to ONNX: torch.onnx.export(model, img, './test.onnx', verbose=True, opset_version=opset_version, input_names=input_names, output_names=output_names, dynamic_axes=dynamic_axes)

How should the parameters of this function be written? For the official .pt files, I don't know what input/output shapes and names to use for the conversion.

PaulX1029 avatar Nov 11 '22 07:11 PaulX1029

import torch  # assumes the kapao repo is importable so the checkpoint's classes resolve

def torch2onnx(model_, input_, output_name="./test.onnx"):
    # model_ must be a loaded nn.Module in eval mode, not a path string
    input_names = ["input_1"]
    output_names = ["output_1"]
    opset_version = 13
    dynamic_axes = None
    # dynamic_axes = {'input_1': [0, 2, 3], 'output_1': [0, 1]}
    torch.onnx.export(model_, input_, output_name, verbose=True, opset_version=opset_version,
                      input_names=input_names, output_names=output_names,
                      dynamic_axes=dynamic_axes, do_constant_folding=True)
    print('convert done!')  # raising a string here would itself be a TypeError in Python 3

@PaulX1029

ykk648 avatar Nov 11 '22 07:11 ykk648

Are you converting the official kapao_s_coco.pt? Following your code, the conversion fails with this error:

Traceback (most recent call last):
  File "/mnt/sda/AI/kapao-master/export_xzw.py", line 18, in <module>
    torch2onnx(model_path, img, output_name)
  File "/mnt/sda/AI/kapao-master/export_xzw.py", line 11, in torch2onnx
    dynamic_axes=dynamic_axes, do_constant_folding=True)
  File "/mnt/sda/AI/miniconda3/envs/yolov5/lib/python3.7/site-packages/torch/onnx/__init__.py", line 276, in export
    custom_opsets, enable_onnx_checker, use_external_data_format)
  File "/mnt/sda/AI/miniconda3/envs/yolov5/lib/python3.7/site-packages/torch/onnx/utils.py", line 94, in export
    use_external_data_format=use_external_data_format)
  File "/mnt/sda/AI/miniconda3/envs/yolov5/lib/python3.7/site-packages/torch/onnx/utils.py", line 676, in _export
    with select_model_mode_for_export(model, training):
  File "/mnt/sda/AI/miniconda3/envs/yolov5/lib/python3.7/contextlib.py", line 112, in __enter__
    return next(self.gen)
  File "/mnt/sda/AI/miniconda3/envs/yolov5/lib/python3.7/site-packages/torch/onnx/utils.py", line 38, in select_model_mode_for_export
    is_originally_training = model.training
AttributeError: 'str' object has no attribute 'training'

PaulX1029 avatar Nov 11 '22 08:11 PaulX1029

[image] @ykk648

PaulX1029 avatar Nov 11 '22 08:11 PaulX1029

Sorry, I misunderstood what you meant. The model needs to be loaded with the torch framework first, right?

PaulX1029 avatar Nov 11 '22 08:11 PaulX1029
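Yes, torch.onnx.export needs the loaded module; passing the checkpoint path as a string is exactly what produces the 'str' object has no attribute 'training' error above. A minimal loading sketch, assuming a yolov5-style checkpoint dict with a 'model' key (kapao is built on yolov5) and that it is run from inside the kapao repo so the pickled classes resolve; the path and input size are placeholders:

import torch

ckpt = torch.load('kapao_s_coco.pt', map_location='cpu')  # a dict, not an nn.Module
model = ckpt['model'].float().eval()                      # unwrap the actual module
img = torch.zeros(1, 3, 640, 640)                         # dummy input for tracing
torch2onnx(model, img, './kapao_s_coco.onnx')             # pass the module, not the path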

@PaulX1029

ykk648 avatar Nov 14 '22 02:11 ykk648

It would be best if the authors provided an official export.py script.

xinsuinizhuan avatar Nov 14 '22 15:11 xinsuinizhuan

I converted the model to ONNX with the following options:

import torch

# dummy input at the export resolution, matching the model's parameter dtype
im = torch.randn(1, 3, 640, 640).type_as(next(model.parameters()))

torch.onnx.export(
        model.cpu(),
        im.cpu(),
        "kapao.onnx",
        verbose=False,
        opset_version=12,
        do_constant_folding=True,
        input_names=['images'],
        output_names=['output'],
        dynamic_axes=None)

The conversion seems to be successful, but when I load the model for inference using onnxruntime I get an error:

import onnxruntime as ort

session = ort.InferenceSession(model_path)

Error:

onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from kapao.onnx failed:Node (Mul_2329) Op (Mul) [ShapeInferenceError] Incompatible dimensions

Was anyone able to run inference with ONNX Runtime?

nikhilchh avatar Jul 13 '23 16:07 nikhilchh

https://github.com/ykk648/AI_power/blob/main/body_lib/body_kp_detector/body_kp_detector_kapao/body_kp_detector_kapao.py

ykk648 avatar Jul 14 '23 09:07 ykk648
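For anyone who just wants the bare onnxruntime call without the linked wrapper, a minimal sketch; the model path, input name, shape, and preprocessing here are assumptions that must match your export:

import numpy as np
import onnxruntime as ort

session = ort.InferenceSession('kapao.onnx', providers=['CPUExecutionProvider'])
input_name = session.get_inputs()[0].name                 # e.g. 'images' from the export above
img = np.random.rand(1, 3, 640, 640).astype(np.float32)   # placeholder preprocessed frame
outputs = session.run(None, {input_name: img})            # raw predictions, pre-NMS
print([o.shape for o in outputs])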

@ykk648

I'm going through your dependencies to find where exactly you call onnxruntime.InferenceSession(model_path), but I could not find the code for ModelBase:

from ...model_base import ModelBase

nikhilchh avatar Sep 06 '23 19:09 nikhilchh

I found some changes that were made to the yolov5 repo to handle this issue:

https://github.com/ultralytics/yolov5/pull/2982

I guess this is the cause of the issue during inference.

nikhilchh avatar Sep 06 '23 21:09 nikhilchh
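The PR above changes the Detect head's export behavior. As a hedged sketch of the export-time tweak commonly applied to yolov5-family models before torch.onnx.export (whether kapao's head exposes the same inplace attribute is an assumption, hence the hasattr guard):

# run on the loaded model before torch.onnx.export
model.eval()
for m in model.modules():
    if hasattr(m, 'inplace'):
        m.inplace = False  # disable in-place slice assignment in the Detect head,
                           # which can trace into graphs that fail ONNX shape inference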

@nikhilchh https://github.com/ykk648/apstone/blob/main/apstone/wrappers/onnx_wrapper/onnx_model.py https://github.com/ykk648/apstone/blob/main/apstone/model_base.py

ykk648 avatar Sep 07 '23 01:09 ykk648