
nvonnxparser throwing std::out_of_range error

Open · CarlPoirier opened this issue · 3 comments

Description

Hi,

I'm trying to create a TRT engine from any ONNX file that I have trained and exported with PyTorch.

So I'm using this command:

    trtexec.exe --onnx=raccoon_mobilenet_opset13.onnx --workspace=2048 --saveEngine=test.engine --shapes=input:1x3x650x417 --verbose --explicitBatch

When doing so, it just exits abruptly. The last verbose lines are:

    [05/27/2021-14:46:29] [V] [TRT] ModelImporter.cpp:103: Parsing node: Squeeze_3 [Squeeze]
    [05/27/2021-14:46:29] [V] [TRT] ModelImporter.cpp:119: Searching for input: 286
    [05/27/2021-14:46:29] [V] [TRT] ModelImporter.cpp:119: Searching for input: 287
    [05/27/2021-14:46:29] [V] [TRT] ModelImporter.cpp:125: Squeeze_3 [Squeeze] inputs: [286 -> (-1, -1, -1, -1)], [287 -> (1)],

So I ran trtexec in Visual Studio and saw that it throws a std::out_of_range error at this call:

    if (!parser.onnxParser->parseFromFile(model.baseModel.model.c_str(), static_cast<int>(sample::gLogger.getReportableSeverity())))

Is this a known issue? In any case, would any of you have a clue how to work around this?

Thanks,

Carl

Environment

TensorRT Version: 7.2.3.4
NVIDIA GPU: GTX 1050 Ti
NVIDIA Driver Version: 461.72
CUDA Version: 11.1
CUDNN Version: 8.0
Operating System: Windows 10 x64
Python Version (if applicable): 3.8.5
Tensorflow Version (if applicable):
PyTorch Version (if applicable): 1.9.0.dev20210526+cu111 (needed for exporting hardsigmoid to ONNX)
Baremetal or Container (if so, version): baremetal, Python virtual environments

Relevant Files

First ONNX model is here. It's an object detection model with a mobilenetv3 backbone.

Second ONNX model is here. It's an object detection model with a resnet50 backbone.

Steps To Reproduce

  1. Download my model file.
  2. Execute trtexec as mentioned above.

CarlPoirier · May 27 '21 19:05

I was able to get past the original assert by exporting the model with different batch-size parameters, and it makes more sense now. I updated the linked file. Now I get a different error: a std::out_of_range thrown inside nvonnxparser. I have updated the original post.

CarlPoirier · Jun 01 '21 14:06

Actually, this is an issue only when using opset 13. If I export using opset 11, I don't get the std::out_of_range.

CarlPoirier · Jun 01 '21 15:06

Thanks for narrowing it down to an opset 13 issue. I'll take a look to see what's going on.

kevinch-nv · Jun 07 '21 18:06