
Convert PLATO-2 model to ONNX format

Open fadelma opened this issue 3 years ago • 6 comments

Hello everyone,

I'm currently trying to convert the PLATO-2 model to ONNX format using Paddle2ONNX. However, when I try to convert the NSP model, I get this error:

Traceback (most recent call last):
  File "/usr/local/bin/paddle2onnx", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.7/dist-packages/paddle2onnx/command.py", line 184, in main
    input_shape_dict=input_shape_dict)
  File "/usr/local/lib/python3.7/dist-packages/paddle2onnx/command.py", line 148, in program2onnx
    operator_export_type=operator_export_type)
  File "/usr/local/lib/python3.7/dist-packages/paddle2onnx/convert.py", line 84, in program2onnx
    enable_onnx_checker, operator_export_type)
  File "/usr/local/lib/python3.7/dist-packages/paddle2onnx/convert.py", line 34, in export_onnx
    operator_export_type, verbose)
  File "/usr/local/lib/python3.7/dist-packages/paddle2onnx/graph/onnx_graph.py", line 240, in build
    onnx_graph = ONNXGraph(paddle_graph, opset_version=opset_version, operator_export_type=operator_export_type)
  File "/usr/local/lib/python3.7/dist-packages/paddle2onnx/graph/onnx_graph.py", line 79, in __init__
    self.update_opset_version()
  File "/usr/local/lib/python3.7/dist-packages/paddle2onnx/graph/onnx_graph.py", line 194, in update_opset_version
    self.opset_version = OpMapper.get_recommend_opset_version(node_map, self.opset_version)
  File "/usr/local/lib/python3.7/dist-packages/paddle2onnx/op_mapper/op_mapper.py", line 129, in get_recommend_opset_version
    node_map, opset_version, True)
  File "/usr/local/lib/python3.7/dist-packages/paddle2onnx/op_mapper/op_mapper.py", line 174, in check_support_status
    raise NotImplementedError(error_info)
NotImplementedError: 
There's 1 ops are not supported yet
=========== gather_nd ===========
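
For context, a paddle2onnx invocation for this kind of conversion looks roughly like the following (the model directory, file names, and opset value are placeholders for the exported NSP inference model, not the exact arguments used here):

paddle2onnx --model_dir ./NSP \
            --model_filename __model__ \
            --params_filename __params__ \
            --save_file nsp.onnx \
            --opset_version 11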

Is this a Paddle2ONNX issue?

Also, has anyone successfully converted the PLATO-2 model to ONNX format with Paddle2ONNX or an alternative method, and would be willing to share how to do it?

Thank you very much in advance!

fadelma avatar Jan 10 '22 09:01 fadelma

Hello, we have received your question and will resolve it as soon as possible.

yeliang2258 avatar Jan 11 '22 02:01 yeliang2258

Hi, please pull this PR and test the conversion with it, thank you: https://github.com/PaddlePaddle/Paddle2ONNX/pull/479
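
One way to test that PR branch locally is to build from source (a sketch, assuming the standard setup.py-based install; the local branch name pr-479 is arbitrary):

git clone https://github.com/PaddlePaddle/Paddle2ONNX.git
cd Paddle2ONNX
git fetch origin pull/479/head:pr-479
git checkout pr-479
python setup.py install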

yeliang2258 avatar Jan 11 '22 03:01 yeliang2258

Hello @yeliang2258, I got this error when trying to convert the NSP model of PLATO-2 32L:

2022-01-11 04:33:16 [INFO]      ONNX model generated is valid.
Traceback (most recent call last):
  File "/mnt/data3/shared_envs/paddle_env/lib/python3.9/site-packages/paddle2onnx-0.9.0-py3.9.egg/paddle2onnx/utils.py", line 43, in check_model
  File "/mnt/data3/shared_envs/paddle_env/lib/python3.9/site-packages/onnx/checker.py", line 101, in check_model
    protobuf_string = model.SerializeToString()
ValueError: Message onnx.ModelProto exceeds maximum protobuf size of 2GB: 6530750548

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/mnt/data3/shared_envs/paddle_env/bin/paddle2onnx", line 33, in <module>
    sys.exit(load_entry_point('paddle2onnx==0.9.0', 'console_scripts', 'paddle2onnx')())
  File "/mnt/data3/shared_envs/paddle_env/lib/python3.9/site-packages/paddle2onnx-0.9.0-py3.9.egg/paddle2onnx/command.py", line 187, in main
  File "/mnt/data3/shared_envs/paddle_env/lib/python3.9/site-packages/paddle2onnx-0.9.0-py3.9.egg/paddle2onnx/command.py", line 151, in program2onnx
  File "/mnt/data3/shared_envs/paddle_env/lib/python3.9/site-packages/paddle2onnx-0.9.0-py3.9.egg/paddle2onnx/convert.py", line 83, in program2onnx
  File "/mnt/data3/shared_envs/paddle_env/lib/python3.9/site-packages/paddle2onnx-0.9.0-py3.9.egg/paddle2onnx/convert.py", line 37, in export_onnx
  File "/mnt/data3/shared_envs/paddle_env/lib/python3.9/site-packages/paddle2onnx-0.9.0-py3.9.egg/paddle2onnx/graph/onnx_graph.py", line 234, in export_proto
  File "/mnt/data3/shared_envs/paddle_env/lib/python3.9/site-packages/paddle2onnx-0.9.0-py3.9.egg/paddle2onnx/utils.py", line 45, in check_model
Exception: ONNX model is not valid.

fadelma avatar Jan 11 '22 04:01 fadelma

@fadelma The ONNX checker fails when the model size exceeds 2GB. We haven't tested conversion on such a large model, so please try disabling the ONNX checker while converting from Paddle to ONNX.
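
For example (flag name as in the paddle2onnx 0.9 command line; the other arguments below are placeholders):

paddle2onnx --model_dir ./NSP \
            --model_filename __model__ \
            --params_filename __params__ \
            --save_file nsp.onnx \
            --opset_version 11 \
            --enable_onnx_checker False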

jiangjiajun avatar Jan 11 '22 06:01 jiangjiajun

Hello @jiangjiajun and @yeliang2258. I have disabled the ONNX checker, but then I got this error:

Traceback (most recent call last):
  File "/mnt/data3/shared_envs/paddle_env/bin/paddle2onnx", line 33, in <module>
    sys.exit(load_entry_point('paddle2onnx==0.9.0', 'console_scripts', 'paddle2onnx')())
  File "/mnt/data3/shared_envs/paddle_env/lib/python3.9/site-packages/paddle2onnx-0.9.0-py3.9.egg/paddle2onnx/command.py", line 187, in main
  File "/mnt/data3/shared_envs/paddle_env/lib/python3.9/site-packages/paddle2onnx-0.9.0-py3.9.egg/paddle2onnx/command.py", line 151, in program2onnx
  File "/mnt/data3/shared_envs/paddle_env/lib/python3.9/site-packages/paddle2onnx-0.9.0-py3.9.egg/paddle2onnx/convert.py", line 83, in program2onnx
  File "/mnt/data3/shared_envs/paddle_env/lib/python3.9/site-packages/paddle2onnx-0.9.0-py3.9.egg/paddle2onnx/convert.py", line 46, in export_onnx
ValueError: Message onnx.ModelProto exceeds maximum protobuf size of 2GB: 6530750548

fadelma avatar Jan 11 '22 07:01 fadelma

Please open the same issue on Paddle2ONNX; this will take some effort to solve.
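
(For reference, the ONNX format itself can hold models larger than 2GB by moving the weights out of the protobuf into external data files; the rough sketch below shows that API. Getting the converter to write models this way during export is the part that needs the effort mentioned above, so treat this as a direction rather than a ready fix.)

import onnx

# Sketch only: this assumes an in-memory ModelProto already exists, which is
# exactly the step that currently overflows when serialized in one piece.
model = onnx.load("nsp.onnx", load_external_data=False)  # placeholder path

# Store large initializers in a side file so the .onnx protobuf stays under 2GB.
onnx.save_model(
    model,
    "nsp.external.onnx",
    save_as_external_data=True,
    all_tensors_to_one_file=True,
    location="nsp.external.onnx.data",
    size_threshold=1024,  # externalize tensors larger than 1KB
)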

jiangjiajun avatar Jan 11 '22 07:01 jiangjiajun