
Protobuf parsing failed

Open codyuan opened this issue 1 year ago • 4 comments

Bug Report

Describe the bug

I'm trying to start an inference session with onnxruntime.InferenceSession after exporting my trained model to ONNX, on both Linux and Windows. It usually works fine on Linux, but on Windows it often fails with "Protobuf parsing failed". Sometimes I can fix this by regenerating the ONNX model in the same way, but that doesn't work every time. Is there any way to fix this? Is it because the ONNX file is broken, and if so, how can I locate the broken part? I'm attaching the generated file and the detailed error output for reference.
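A minimal way to check whether the file itself parses (independent of onnxruntime) is to load it with the onnx Python API and run the checker. A sketch, using the file name from this report:

import onnx
from google.protobuf.message import DecodeError

path = "nav_model.onnx"
try:
    model = onnx.load(path)          # parse the protobuf from disk
    onnx.checker.check_model(model)  # validate the graph structure
    print("model parses and passes the checker")
except DecodeError as e:
    # the bytes on disk are not a valid ModelProto, i.e. the file is corrupt or truncated
    print(f"protobuf parsing failed: {e}")
except onnx.checker.ValidationError as e:
    print(f"model parsed but failed validation: {e}")

If onnx.load raises the same DecodeError, the file itself is damaged rather than anything specific to onnxruntime.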

System information

  • OS Platform and Distribution : Linux Ubuntu 22.04, Windows 11
  • ONNX version : 1.16.1
  • onnxruntime version : 1.18.0
  • protobuf version : 3.20.1

Reproduction instructions

import onnxruntime
onnxruntime.InferenceSession('nav_model.onnx')

Broken File

nav_model.zip

Detailed Information

Traceback (most recent call last):
  File "/workspace/miniconda3/lib/python3.11/runpy.py", line 198, in _run_module_as_main
    return _run_code(code, main_globals, None,
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspace/miniconda3/lib/python3.11/runpy.py", line 88, in _run_code
    exec(code, run_globals)
  File "/root/.vscode-server/extensions/ms-python.python-2023.14.0/pythonFiles/lib/python/debugpy/adapter/../../debugpy/launcher/../../debugpy/__main__.py", line 39, in <module>
    cli.main()
  File "/root/.vscode-server/extensions/ms-python.python-2023.14.0/pythonFiles/lib/python/debugpy/adapter/../../debugpy/launcher/../../debugpy/../debugpy/server/cli.py", line 430, in main
    run()
  File "/root/.vscode-server/extensions/ms-python.python-2023.14.0/pythonFiles/lib/python/debugpy/adapter/../../debugpy/launcher/../../debugpy/../debugpy/server/cli.py", line 284, in run_file
    runpy.run_path(target, run_name="__main__")
  File "/root/.vscode-server/extensions/ms-python.python-2023.14.0/pythonFiles/lib/python/debugpy/_vendored/pydevd/_pydevd_bundle/pydevd_runpy.py", line 321, in run_path
    return _run_module_code(code, init_globals, run_name,
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/.vscode-server/extensions/ms-python.python-2023.14.0/pythonFiles/lib/python/debugpy/_vendored/pydevd/_pydevd_bundle/pydevd_runpy.py", line 135, in _run_module_code
    _run_code(code, mod_globals, init_globals,
  File "/root/.vscode-server/extensions/ms-python.python-2023.14.0/pythonFiles/lib/python/debugpy/_vendored/pydevd/_pydevd_bundle/pydevd_runpy.py", line 124, in _run_code
    exec(code, run_globals)
  File "/workspace/rl-agent/rlagent/WtHeroAgent.py", line 144, in <module>
    agent = Agent("level_7", battle_model_path, crypto=0, with_logging=1, max_pool_n=10)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspace/rl-agent/rlagent/WtHeroAgent.py", line 79, in __init__
    self.agent_task_dict["navigate"] = NavAgent(
                                       ^^^^^^^^^
  File "/workspace/rl-agent/rlagent/task_agents/nav_agent.py", line 65, in __init__
    self.init_agent(crypto, model_path)
  File "/workspace/rl-agent/rlagent/task_agents/nav_agent.py", line 102, in init_agent
    self.session = onnxruntime.InferenceSession(model)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspace/miniconda3/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 419, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/workspace/miniconda3/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 472, in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
onnxruntime.capi.onnxruntime_pybind11_state.InvalidProtobuf: [ONNXRuntimeError] : 7 : INVALID_PROTOBUF : Load model from /workspace/rl-agent/rlagent/model/nav_model.onnx failed:Protobuf parsing failed.

codyuan avatar Oct 10 '24 07:10 codyuan

You should save your model with external weights so that the main file is not bigger than 2Gb.
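For context, "external weights" refers to ONNX's external-data format, where initializer tensors are stored in a side file so that the main .onnx protobuf stays under protobuf's 2 GB limit. A minimal sketch, with illustrative file names and threshold:

import onnx

model = onnx.load("nav_model.onnx")
onnx.save_model(
    model,
    "nav_model_external.onnx",
    save_as_external_data=True,    # move initializer tensors out of the protobuf
    all_tensors_to_one_file=True,  # keep all weights in a single side file
    location="nav_model.weights",  # written next to the .onnx file
    size_threshold=1024,           # only externalize tensors larger than 1 KB
)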

xadupre avatar Oct 10 '24 09:10 xadupre

Sorry, I am a little bit confused about "external weights". Also, my ONNX file is 578 KB, which is definitely not bigger than 2 GB.

codyuan avatar Oct 10 '24 09:10 codyuan

Sorry for my misunderstanding. I tried to replicate your issue but it works for me. Which version of onnxruntime are you using? Maybe you can use a more recent one.

xadupre avatar Oct 10 '24 10:10 xadupre

Thank you, I will try. I uploaded the ONNX file at the moment it caused the error, and it did seem broken. But after that I deleted it from my project and replaced it with the copy downloaded from this page (which should be exactly the same file), and the code works. It's strange; I don't know what corrupted the file or when. I need to do more validation.
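One way to narrow down when the corruption happens is to checksum the file right after export and again on the machine where loading fails; if the hashes differ, the copy or transfer step altered the file rather than the export. A minimal sketch, with the file name from this report:

import hashlib

def sha256(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

print(sha256("nav_model.onnx"))  # run once right after export, once again before loading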

codyuan avatar Oct 11 '24 09:10 codyuan

I have the same problem and I would like to make sure I understand correctly: do I need to replace some ONNX file with the nav_model file attached above? If not, how do I fix my problem, described below?

I get this message: KokoroSpeaker [ONNXRuntimeError] : 7 : INVALID_PROTOBUF : Load model from E:\ComfyUIstanAlone\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-kokoro\kokoro-v0_19.onnx failed:Protobuf parsing failed.

And this log: onnxruntime.capi.onnxruntime_pybind11_state.InvalidProtobuf: [ONNXRuntimeError] : 7 : INVALID_PROTOBUF : Load model from E:\ComfyUIstanAlone\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-kokoro\kokoro-v0_19.onnx failed:Protobuf parsing failed.

2025-05-16T19:49:07.264919 - Prompt executed in 0.09 seconds

## Attached Workflow
Please make sure that workflow does not contain any sensitive information such as API keys or passwords.

{"id":"5f21dd95-b477-4d04-86fe-ff735cb1ea7b","revision":0,"last_node_id":6,"last_link_id":7,"nodes":[{"id":5,"type":"PreviewAudio","pos":[3885.45751953125,-38.369808197021484],"size":[315,88],"flags":{},"order":2,"mode":0,"inputs":[{"localized_name":"audio","name":"audio","type":"AUDIO","link":6},{"localized_name":"audioUI","name":"audioUI","type":"AUDIO_UI","widget":{"name":"audioUI"},"link":null}],"outputs":[],"properties":{"cnr_id":"comfy-core","ver":"0.3.32","Node name for S&R":"PreviewAudio"},"widgets_values":[]},{"id":2,"type":"KokoroSpeaker","pos":[3036.4716796875,128.60635375976562],"size":[315,58],"flags":{},"order":0,"mode":0,"inputs":[{"localized_name":"speaker_name","name":"speaker_name","type":"COMBO","widget":{"name":"speaker_name"},"link":null}],"outputs":[{"localized_name":"speaker","name":"speaker","type":"KOKORO_SPEAKER","slot_index":0,"links":[7]}],"properties":{"cnr_id":"comfyui-kokoro","ver":"1.0.1","Node name for S&R":"KokoroSpeaker"},"widgets_values":["af_sarah"]},{"id":4,"type":"KokoroGenerator","pos":[3430.16650390625,56.75017547607422],"size":[400,200],"flags":{},"order":1,"mode":0,"inputs":[{"localized_name":"speaker","name":"speaker","type":"KOKORO_SPEAKER","link":7},{"localized_name":"text","name":"text","type":"STRING","widget":{"name":"text"},"link":null},{"localized_name":"speed","name":"speed","type":"FLOAT","widget":{"name":"speed"},"link":null},{"localized_name":"lang","name":"lang","type":"STRING","widget":{"name":"lang"},"link":null}],"outputs":[{"localized_name":"audio","name":"audio","type":"AUDIO","slot_index":0,"links":[6]}],"properties":{"cnr_id":"comfyui-kokoro","ver":"1.0.1","Node name for S&R":"KokoroGenerator"},"widgets_values":["i am now going to show you how to install comfy UI, its very simple and fast.",0.88,"English"]}],"links":[[6,4,0,5,0,"AUDIO"],[7,2,0,4,0,"KOKORO_SPEAKER"]],"groups":[],"config":{},"extra":{"ds":{"scale":1,"offset":[-2923.9002990722656,307.72314453125]},"frontendVersion":"1.18.9","node_versions":{"comfyui-kokoro":"743120d2e5eec4eb4503205f9f4a93d9b997d7f6","comfy-core":"0.3.12"},"VHS_latentpreview":false,"VHS_latentpreviewrate":0,"workspace_info":{"id":"dMK_03qIT8BT3bZVnt3bf"},"VHS_MetadataImage":true,"VHS_KeepIntermediate":true},"version":0.4}


## Additional Context
(Please add any additional context or steps to reproduce the error here)
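In setups like this, the .onnx file on disk is sometimes not the real model at all but a truncated download or a Git LFS pointer text file saved in its place. A quick look at the file size and first bytes can tell; a sketch, using the path from the error message above:

import os

path = r"E:\ComfyUIstanAlone\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-kokoro\kokoro-v0_19.onnx"

print("file size:", os.path.getsize(path), "bytes")  # a few hundred bytes usually means a pointer file, not the model

with open(path, "rb") as f:
    head = f.read(64)
if head.startswith(b"version https://git-lfs"):
    print("this is a Git LFS pointer, not the model; re-download the actual weights")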

ABIA2024 avatar May 16 '25 17:05 ABIA2024