Zero Zeng
TRT supports multiple plugin outputs; please check your code or refer to our OSS plugin implementations.
> Can different types of multiple outputs also be supported? I only see reference code where the outputs have the same type.

Yes, it's supported for [IPluginV2IOExt](https://docs.nvidia.com/deeplearning/sdk/tensorrt-api/c_api/classnvinfer1_1_1_i_plugin_v2_i_o_ext.html) and [IPluginV2DynamicExt](https://docs.nvidia.com/deeplearning/sdk/tensorrt-api/c_api/classnvinfer1_1_1_i_plugin_v2_dynamic_ext.html).
> don't know what's the problem @zerollzeng

I can't see the root cause from those logs alone. Can you share your plugin code here?
> Registering tensor: onnx::Reshape_571 for ONNX tensor: onnx::Reshape_571

Is your plugin's output tensor the input of a Reshape node?
Can you `export LD_LIBRARY_PATH=/usr/local/TensorRT-8.2.5.1/lib:$LD_LIBRARY_PATH` and try again?
I'm not an expert on the Linux env setup so I can't answer this question, but I would like to learn something here :-)
Looks like a usage issue; very likely something in your code is wrong. Can you provide a reproduction?
Do you mean you can reproduce it with trtexec? Can you share the ONNX with us? There may be a bug in TRT that we need to investigate; it should never...
> You need to merge these three ONNX files, https://github.com/ywfwyht/onnx_model

Can you upload it to Google Drive, or use Git LFS to add those files to your repo?
Okay. Can you tell me how to merge these sub-ONNX models?