
ERROR: onnx2trt_utils.cpp:1708 In function unaryHelper: [8] Assertion failed: validUnaryType

Open · chadrick-kwag opened this issue 3 years ago · 1 comment

Description

I encountered the following error message when converting an ONNX model to a TensorRT engine.

[12/17/2021-13:48:24] [V] [TRT] ModelImporter.cpp:179: Add_68 [Add] outputs: [247 -> (1, 512, 512)], 
[12/17/2021-13:48:24] [V] [TRT] ModelImporter.cpp:103: Parsing node: Abs_69 [Abs]
[12/17/2021-13:48:24] [V] [TRT] ModelImporter.cpp:119: Searching for input: 240
[12/17/2021-13:48:24] [V] [TRT] ModelImporter.cpp:125: Abs_69 [Abs] inputs: [240 -> (1, 512, 512)], 
ERROR: onnx2trt_utils.cpp:1708 In function unaryHelper:
[8] Assertion failed: validUnaryType
[12/17/2021-13:48:24] [E] Failed to parse onnx file
[12/17/2021-13:48:24] [E] Parsing model failed
[12/17/2021-13:48:24] [E] Engine creation failed
[12/17/2021-13:48:24] [E] Engine set up failed

The ONNX model has 4 inputs and 2 outputs. No dynamic sizes were used when creating the ONNX model from PyTorch.

It looks like parsing failed at the Abs_69 node. The input to the Abs_69 node is an output of a Sub operation, and it is an int64 tensor, according to my inspection with Netron.

I suspect the error is caused by this line: https://github.com/onnx/onnx-tensorrt/blob/803e699a22a2f92ae0c004a812367110fbf34548/onnx2trt_utils.cpp#L1842

but I don't know why the Abs operation is not allowed to work with int types.

The ONNX model was generated from PyTorch 1.9.0, with opset 11.
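If upgrading TensorRT is not an option, one common workaround (a sketch of my own, not an official fix) is to cast the integer tensor to float before the abs in the PyTorch model and cast back afterwards, so the exported graph feeds Abs a float tensor that this parser version accepts:

```python
# Sketch: avoid an int64 Abs in the exported ONNX graph by routing
# the abs through float. AbsWorkaround is a hypothetical module name.
import torch

class AbsWorkaround(torch.nn.Module):
    def forward(self, a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
        diff = a - b  # int64 Sub, as in the failing graph
        # Cast to float so the exported Abs node gets a float input,
        # then cast back to int64 to preserve downstream dtypes.
        return diff.float().abs().to(torch.int64)
```

Note that float32 can only represent integers exactly up to 2^24, so this is safe only if the values involved stay well within that range.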

Environment

TensorRT Version: 7.2.3.4
ONNX-TensorRT Version / Branch: ?
GPU Type: Tesla V100
Nvidia Driver Version: 470.57.02
CUDA Version: 11.0
CUDNN Version: 8.1.0
Operating System + Version: CentOS 7
Python Version (if applicable): 3.9
TensorFlow + TF2ONNX Version (if applicable): —
PyTorch Version (if applicable): 1.9.0
Baremetal or Container (if container which image + tag): Baremetal

Relevant Files

Unfortunately I cannot share the onnx file.

Steps To Reproduce

With the ONNX file, I ran the following command:

$ trtexec --onnx=$ONNX_FILE --saveEngine=$OUTPUT --explicitBatch --workspace=5000 --device=0 --verbose

chadrick-kwag avatar Dec 17 '21 05:12 chadrick-kwag

This should be fixed on the latest main branch for TRT 8.2. Can you update your TensorRT version and try to parse your model again?

kevinch-nv avatar Mar 21 '22 18:03 kevinch-nv