TensorRT
                        Expose additional ONNX protobuf model properties via nvonnxparser::IParser
When parsing an ONNX model to build a TRT network, it would be useful to expose the additional top-level fields of the ONNX protobuf. Currently, only the graph is exposed.
```cpp
auto onnx_parser = std::unique_ptr<nvonnxparser::IParser>(
    nvonnxparser::createParser(*network, getLogger()));
onnx_parser->parseFromFile(onnx_path.c_str(),
                           static_cast<int>(nvinfer1::ILogger::Severity::kINFO));
```
These additional properties include:
- doc_string
- domain
- ir_version
- metadata_props
- model_version
- opset_import
- producer_name
- producer_version
Often the metadata_props contain important information describing how to execute or interpret the model, for example the class names and preprocessing steps.
In Python, reading these fields from the ONNX file is straightforward, but in C++ it is much more involved: you have to compile/include the ONNX repo and work around protobuf size limits. Since TRT already parses the ONNX protobuf, it would be straightforward to expose these properties somewhere on this interface.
I would create a PR for this myself, but I believe the code is closed source. Let me know if it’s not.
@kevinch-nv ^ ^
Could you raise this question in https://github.com/onnx/onnx-tensorrt/issues? I will close this issue, thanks!