
The exported ONNX model's results differ significantly from the original model

wzz981 opened this issue on May 18, 2023 · 0 comments

Please fill in the information below completely so we can resolve the issue quickly, thank you!

Problem description: exporting a model with paddle2onnx

The export command was:

paddle2onnx --model_dir model --model_filename inference.pdmodel --params_filename inference.pdiparams --save_file model.onnx --enable_dev_version True

The export log was:

[Paddle2ONNX] Start to parse PaddlePaddle model...
[Paddle2ONNX] Model file path: model\inference.pdmodel
[Paddle2ONNX] Paramters file path: model\inference.pdiparams
[Paddle2ONNX] Start to parsing Paddle model...
[Paddle2ONNX] Use opset_version = 12 for ONNX export.
[Paddle2ONNX] Find dumplicate output name 'fill_constant_47.tmp_0', it will rename to 'p2o.fill_constant_47.tmp_0.0'.
[Paddle2ONNX] Find dumplicate output name 'p2o.fill_constant_47.tmp_0.0', it will rename to 'p2o.p2o.fill_constant_47.tmp_0.0.0'.
[Paddle2ONNX] Find dumplicate output name 'eager_tmp_0', it will rename to 'p2o.eager_tmp_0.0'.
[Paddle2ONNX] Find dumplicate output name 'eager_tmp_1', it will rename to 'p2o.eager_tmp_1.0'.
[Paddle2ONNX] Find dumplicate output name 'eager_tmp_2', it will rename to 'p2o.eager_tmp_2.0'.
[Paddle2ONNX] Find dumplicate output name 'eager_tmp_3', it will rename to 'p2o.eager_tmp_3.0'.
[Paddle2ONNX] Find dumplicate output name 'eager_tmp_4', it will rename to 'p2o.eager_tmp_4.0'.
[Paddle2ONNX] Find dumplicate output name 'eager_tmp_5', it will rename to 'p2o.eager_tmp_5.0'.
[Paddle2ONNX] Find dumplicate output name 'eager_tmp_6', it will rename to 'p2o.eager_tmp_6.0'.
[Paddle2ONNX] Find dumplicate output name 'eager_tmp_7', it will rename to 'p2o.eager_tmp_7.0'.
[Paddle2ONNX] Find dumplicate output name 'eager_tmp_8', it will rename to 'p2o.eager_tmp_8.0'.
[Paddle2ONNX] PaddlePaddle model is exported as ONNX format now.
2023-05-18 14:33:25 [INFO]      ===============Make PaddlePaddle Better!================
2023-05-18 14:33:25 [INFO]      A little survey: https://iwenjuan.baidu.com/?code=r8hu2s

Inference with the exported model works, but the results differ noticeably from the original model.

More information:

  • Inference engines used for deployment: onnxruntime, openvino
  • Why ONNX conversion is needed: deployment
  • Paddle2ONNX version: 1.0.6
  • Contact (Email/Wechat/Phone): [email protected]

Error screenshot (the script used to compare the outputs):

import numpy as np

# outs[0]: output from the exported ONNX model (ONNXRuntime / OpenVINO)
# out:     output tensor from the original Paddle model
diff = outs[0] - out.numpy()
max_abs_diff = np.fabs(diff).max()
if max_abs_diff < 1e-05:
    print("The difference of results between ONNXRuntime and Paddle looks good!")
else:
    # Fall back to a relative difference when the absolute difference is large.
    relative_diff = max_abs_diff / np.fabs(out.numpy()).max()
    if relative_diff < 1e-05:
        print("The difference of results between ONNXRuntime and Paddle looks good!")
    else:
        print("The difference of results between ONNXRuntime and Paddle looks bad!")
    print('relative_diff: ', relative_diff)
print('max_abs_diff: ', max_abs_diff)
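
For context, a minimal sketch of how outs (ONNXRuntime) and out (Paddle) could be produced for the comparison above; the input shape, the single-input assumption, and the paddle.jit.load path prefix are assumptions, not taken from the original report:

import numpy as np
import onnxruntime as ort
import paddle

# Hypothetical input; the shape must match the real model's input.
x = np.random.rand(1, 3, 224, 224).astype(np.float32)

# ONNXRuntime inference on the exported model.onnx.
sess = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
input_name = sess.get_inputs()[0].name
outs = sess.run(None, {input_name: x})

# Original Paddle model (assumes model/inference.pdmodel + model/inference.pdiparams).
paddle_model = paddle.jit.load("model/inference")
paddle_model.eval()
out = paddle_model(paddle.to_tensor(x))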

Running the comparison script above gives the following results.

Comparing inference with paddle.inference against the original model:

The difference of results between ONNXRuntime and Paddle looks good!
max_abs_diff:  0.0

Comparing the exported ONNX model run with ONNXRuntime against the original model:

The difference of results between ONNXRuntime and Paddle looks bad!
relative_diff:  0.24361868
max_abs_diff:  0.26957464

Loading the exported ONNX model with OpenVINO's compile_model(model=onnx_model_dir, device_name="CPU") runs inference successfully. Comparing the OpenVINO output of the exported ONNX model against the original model:

The difference of results between ONNXRuntime and Paddle looks bad!
relative_diff:  1.0165321
max_abs_diff:  1.1248369
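
For reference, a minimal sketch of the OpenVINO path described above; it assumes onnx_model_dir points to the exported model.onnx and uses a hypothetical input shape:

import numpy as np
from openvino.runtime import Core

core = Core()
# Compile the exported ONNX model directly for CPU.
compiled_model = core.compile_model(model="model.onnx", device_name="CPU")
output_layer = compiled_model.output(0)

x = np.random.rand(1, 3, 224, 224).astype(np.float32)  # hypothetical input
result = compiled_model([x])[output_layer]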

Loading the Paddle model directly with OpenVINO's ie.read_model(model=model_file_path, weights=params_file_path) fails with the following error:

RuntimeError: Check 'creator_it != CREATORS_MAP.end()' failed at C:\Jenkins\workspace\private-ci\ie\build-windows-vs2019\b\repos\openvino\src\frontends\paddle\src\frontend.cpp:45:
FrontEnd API failed with OpConversionFailure: :
No creator found for set_value node.
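
For completeness, the failing call was presumably made along these lines (a sketch; ie is assumed to be an openvino.runtime.Core instance and the paths are the ones from the export command):

from openvino.runtime import Core

ie = Core()
# Reading the Paddle model directly fails: per the error message, OpenVINO's
# Paddle frontend has no converter ("creator") for the set_value op.
model = ie.read_model(model="model/inference.pdmodel",
                      weights="model/inference.pdiparams")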

Other information
