
Deconvolution with provided output shape inconsistency between TensorRT and ONNXRuntime

Open · h6197627 opened this issue 1 month ago · 0 comments

Description

There is an inconsistency between the outputs produced by TensorRT and ONNXRuntime for the Deconvolution op when the output shape is manually specified in the ONNX model (a single ConvTranspose layer).

The attached comparison Python script loads the ONNX model, creates a TensorRT engine from it, then runs both models on a reference input and compares the outputs. The output shape is correct, but the values differ.

After some investigation it turned out that the current TensorRT output matches ONNXRuntime before version 1.14.0. Most likely these ONNXRuntime commits moved the behavior closer to the ONNX standard:
https://github.com/microsoft/onnxruntime/commit/6246662b1d1b0ba33a9eebfcc426397f62335f82
https://github.com/microsoft/onnxruntime/commit/f96f2225262ed9aaa17604aeb3185b98c5dc71d2
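For context, when output_shape is given, the ONNX ConvTranspose specification derives the implicit padding and splits it between the begin and end of each axis depending on auto_pad; a mismatch in that begin/end split would produce exactly this kind of same-shape, different-values discrepancy. A minimal sketch of that formula as I read the spec (the function name is mine, not from any library):

```python
def convtranspose_pads(input_size, kernel_size, stride, output_shape,
                       dilation=1, output_padding=0, auto_pad="NOTSET"):
    """Derive the implicit (pad_begin, pad_end) for one axis from a
    requested output_shape, per the ONNX ConvTranspose spec."""
    total = (stride * (input_size - 1) + output_padding
             + ((kernel_size - 1) * dilation + 1) - output_shape)
    if auto_pad == "SAME_UPPER":
        begin = total // 2          # smaller half at the beginning
        end = total - begin
    else:                           # NOTSET / SAME_LOWER
        end = total // 2            # smaller half at the end
        begin = total - end
    return begin, end

# input 3, kernel 4, stride 2, requested output 7 -> total padding 1,
# so the two conventions place the single pad on opposite ends:
print(convtranspose_pads(3, 4, 2, 7))                          # (1, 0)
print(convtranspose_pads(3, 4, 2, 7, auto_pad="SAME_UPPER"))   # (0, 1)
```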

I also checked that the OpenVINO output for this model matches the current ONNXRuntime version.
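To illustrate the symptom, here is a minimal plain-Python sketch (1D case, hypothetical helper) of ConvTranspose semantics: scatter-accumulate the full output, then crop the implicit pads from each end. Splitting the same total padding as (begin=1, end=0) versus (begin=0, end=1) yields outputs of identical shape but different values, consistent with what the comparison script observes:

```python
def deconv1d(x, w, stride, pad_begin, pad_end):
    """Naive 1D transposed convolution: scatter-accumulate each input
    element against the kernel, then crop the implicit pads."""
    full = [0.0] * (stride * (len(x) - 1) + len(w))
    for i, xv in enumerate(x):
        for j, wv in enumerate(w):
            full[i * stride + j] += xv * wv
    return full[pad_begin:len(full) - pad_end]

x = [1.0, 2.0, 3.0]
w = [1.0, 1.0, 1.0, 1.0]
# Same requested output length (7), different pad placement:
print(deconv1d(x, w, 2, 1, 0))  # [1.0, 3.0, 3.0, 5.0, 5.0, 3.0, 3.0]
print(deconv1d(x, w, 2, 0, 1))  # [1.0, 1.0, 3.0, 3.0, 5.0, 5.0, 3.0]
```

Both calls return seven elements, so a shape check alone cannot catch the disagreement; only an element-wise comparison does.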

Environment

TensorRT Version: 10.13.3.9
ONNX-TensorRT Version / Branch: 10.13
GPU Type: GeForce RTX 4090
Nvidia Driver Version: 580.95.05
CUDA Version: 12.9
Operating System + Version: Ubuntu 22.04
Python Version: 3.10.12

Relevant Files

ONNX model
Compare python script

Steps To Reproduce

Put the attached script and model into the same folder and run: python3 cmp.py

h6197627 · Oct 29 '25 17:10