
Convert ONNX model to TensorRT model failure with TensorRT 8.6, GridSample op

milely opened this issue 2 years ago · 6 comments

Description

When I tried to convert the ONNX model to a TRT model using TensorRT 8.6 (ONNX opset 17), the GridSample operation in the model caused an error during conversion.


Can anyone help answer this question?

Environment

TensorRT Version: 8.6.1

NVIDIA GPU: 3090

NVIDIA Driver Version:

CUDA Version: 11.1

CUDNN Version:

Operating System:

Python Version (if applicable):

Tensorflow Version (if applicable):

PyTorch Version (if applicable):

Baremetal or Container (if so, version):

Relevant Files

Model link:

Steps To Reproduce

Commands or scripts:

Have you tried the latest release?:

Can this model run on other frameworks? For example run ONNX model with ONNXRuntime (polygraphy run <model.onnx> --onnxrt):

milely · Oct 24 '23

Your log is incomplete; the later part of the error log should tell you why the ONNX parse fails.

zerollzeng · Oct 25 '23

Your log is incomplete; the later part of the error log should tell you why the ONNX parse fails.

Here is the complete log. It seems to say that the plugin for GridSample was not found. I saw on the official website that TRT 8.6 supports the GridSample operation, but the conversion still fails. Can you help me work out what the problem might be?

[10/25/2023-21:30:50] [W] [TRT] onnx2trt_utils.cpp:369: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
[10/25/2023-21:30:50] [W] [TRT] onnx2trt_utils.cpp:395: One or more weights outside the range of INT32 was clamped
[10/25/2023-21:30:50] [I] [TRT] No importer registered for op: GridSample. Attempting to import as plugin.
[10/25/2023-21:30:50] [I] [TRT] Searching for plugin: GridSample, plugin_version: 1, plugin_namespace:
[10/25/2023-21:30:50] [E] [TRT] ModelImporter.cpp:773: While parsing node number 114 [GridSample -> "/face_warp/GridSample_output_0"]:
[10/25/2023-21:30:50] [E] [TRT] ModelImporter.cpp:774: --- Begin node ---
[10/25/2023-21:30:50] [E] [TRT] ModelImporter.cpp:775: input: "/Transpose_1_output_0" input: "/face_warp/Concat_3_output_0" output: "/face_warp/GridSample_output_0" name: "/face_warp/GridSample" op_type: "GridSample" attribute { name: "align_corners" i: 0 type: INT } attribute { name: "mode" s: "bilinear" type: STRING } attribute { name: "padding_mode" s: "zeros" type: STRING }
[10/25/2023-21:30:50] [E] [TRT] ModelImporter.cpp:776: --- End node ---
[10/25/2023-21:30:50] [E] [TRT] ModelImporter.cpp:779: ERROR: builtin_op_importers.cpp:4890 In function importFallbackPluginImporter: [8] Assertion failed: creator && "Plugin not found, are the plugin name, version, and namespace correct?"
[10/25/2023-21:30:50] [E] Failed to parse onnx file
[10/25/2023-21:30:50] [I] Finished parsing network model. Parse time: 0.393404
[10/25/2023-21:30:50] [E] Parsing model failed
[10/25/2023-21:30:50] [E] Failed to create engine from model or file.
[10/25/2023-21:30:50] [E] Engine set up failed
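For reference, a minimal sketch of how one might confirm what the parser is seeing, assuming the onnx Python package is installed; the path model.onnx is a placeholder for the real file:

import onnx

# Load the exported model and print the opsets it declares.
model = onnx.load("model.onnx")  # placeholder path
for imp in model.opset_import:
    print("domain:", imp.domain or "ai.onnx", "opset:", imp.version)

# List the GridSample nodes the parser is failing on, with their domain.
for node in model.graph.node:
    if node.op_type == "GridSample":
        print(node.name, "domain:", node.domain or "ai.onnx")

If the node sits in the default ai.onnx domain, the parser should normally take the native GridSample importer rather than fall back to a plugin.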


milely · Oct 25 '23

Does the model work with onnxruntime? GridSample should be a supported op; see https://github.com/onnx/onnx-tensorrt/blob/8.6-GA/docs/operators.md

Please provide a repro if it works with onnxruntime.
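As a quick sanity check, here is a minimal sketch of running the model with onnxruntime, assuming onnxruntime and numpy are installed; the path and the input shape below are placeholders that need to match the real model:

import numpy as np
import onnxruntime as ort

# Open the ONNX model on CPU and inspect its first input.
sess = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
inp = sess.get_inputs()[0]
print("input:", inp.name, inp.shape, inp.type)

# Placeholder shape: replace with the model's real input shape.
dummy = np.random.rand(1, 3, 256, 256).astype(np.float32)
outputs = sess.run(None, {inp.name: dummy})
print([o.shape for o in outputs])

If this runs cleanly, the failure is on the TensorRT parsing side rather than in the exported graph itself.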

zerollzeng · Oct 30 '23

I guess you are using a model from MMLab. Try converting with '--staticPlugins=xx/libmmdeploy_tensorrt_ops.so'.

monsterlyg · Nov 06 '23

I am not using a model from mmlab, but I replaced the grid_sample operator with the bilinear_grid_sample operator implemented in mmcv, which solved the problem.
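A rough sketch of this workaround, assuming mmcv is installed and exposes bilinear_grid_sample as in recent releases; FaceWarp and face_warp.onnx are hypothetical names standing in for the real module and output file:

import torch
from mmcv.ops.point_sample import bilinear_grid_sample


class FaceWarp(torch.nn.Module):  # hypothetical module that used grid_sample
    def forward(self, feat, grid):
        # Original call was torch.nn.functional.grid_sample(feat, grid,
        # mode="bilinear", padding_mode="zeros", align_corners=False);
        # mmcv's pure-PyTorch version exports as basic ONNX ops only.
        return bilinear_grid_sample(feat, grid, align_corners=False)


model = FaceWarp().eval()
feat = torch.randn(1, 3, 64, 64)
grid = torch.rand(1, 64, 64, 2) * 2 - 1  # normalized sampling grid in [-1, 1]
torch.onnx.export(model, (feat, grid), "face_warp.onnx", opset_version=17)

Note that bilinear_grid_sample only covers bilinear interpolation with zeros padding, which matches the attributes of the failing node in the log above.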

milely · Nov 06 '23

I guess you are using a model from MMLab. Try converting with '--staticPlugins=xx/libmmdeploy_tensorrt_ops.so'.

Hi, I use mmdeploy to convert the model, but there is no such parameter:

usage: deploy.py [-h] [--test-img TEST_IMG [TEST_IMG ...]] [--work-dir WORK_DIR] [--calib-dataset-cfg CALIB_DATASET_CFG] [--device DEVICE] [--log-level {CRITICAL,FATAL,ERROR,WARN,WARNING,INFO,DEBUG,NOTSET}] [--show] [--dump-info] [--quant-image-dir QUANT_IMAGE_DIR] [--quant] [--uri URI] deploy_cfg model_cfg checkpoint img
deploy.py: error: unrecognized arguments: --static-plugins=/home/admin/miniconda3/lib/python3.10/site-packages/mmdeploy/lib/libmmdeploy_tensorrt_ops.so

jinqiua · Aug 07 '24