TensorRT
ONNX inference results are different from PyTorch inference results
Refer to https://github.com/NVIDIA/TensorRT/tree/master/tools/pytorch-quantization/examples:
- calibrate_quant_resnet18
- finetune_quant_resnet18
- export to ONNX (a rough sketch of the export step follows below)
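Roughly, the export step looks like the sketch below (the checkpoint and output filenames are placeholders, and the model is assumed to be already calibrated/fine-tuned per the examples above):

```python
import torch
import torchvision
from pytorch_quantization import quant_modules
from pytorch_quantization import nn as quant_nn

# Patch torch.nn layers with quantized counterparts before building the model,
# so the calibrated/fine-tuned checkpoint loads into TensorQuantizer-wrapped modules.
quant_modules.initialize()

model = torchvision.models.resnet18(num_classes=1000)
# Hypothetical checkpoint produced by the calibrate/finetune steps above.
model.load_state_dict(torch.load("quant_resnet18_calibrated.pth"))
model.cuda().eval()

# Make the exporter emit standard QuantizeLinear/DequantizeLinear nodes.
quant_nn.TensorQuantizer.use_fb_fake_quant = True

dummy = torch.randn(1, 3, 224, 224, device="cuda")
torch.onnx.export(model, dummy, "quant_resnet18.onnx", opset_version=13,
                  input_names=["input"], output_names=["output"])
```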
But the ONNX inference results are different from the PyTorch inference results:

Mismatched elements: 980 / 1000 (98%)
Max absolute difference: 0.11406998
Max relative difference: 33.32787
x: array([[ 8.407020e-02, 2.851861e+00, 2.868625e+00, 3.110754e+00, 5.232163e+00, 4.126016e+00, 4.256042e+00, 5.695968e-01, -8.674262e-01, -1.964558e-01, -1.039050e-01, 1.224846e+00,...
y: array([[ 1.649464e-01, 2.892797e+00, 2.856432e+00, 3.168223e+00, 5.266732e+00, 4.117697e+00, 4.239751e+00, 5.884879e-01, -8.712382e-01, -1.605508e-01, -1.481664e-01, 1.232234e+00,...
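The report above is the format printed by numpy.testing.assert_allclose; a minimal sketch of that kind of comparison, assuming onnxruntime for the ONNX side and the same input fed to both models:

```python
import numpy as np
import onnxruntime as ort
import torch

# Same preprocessed input for both backends (random here just for illustration).
x = torch.randn(1, 3, 224, 224)

# `model` is the quantized PyTorch model from the export sketch above.
with torch.no_grad():
    torch_out = model(x.cuda()).cpu().numpy()

sess = ort.InferenceSession("quant_resnet18.onnx", providers=["CPUExecutionProvider"])
onnx_out = sess.run(None, {sess.get_inputs()[0].name: x.numpy()})[0]

# On failure this prints the "Mismatched elements / Max absolute difference" report.
np.testing.assert_allclose(torch_out, onnx_out, rtol=1e-3, atol=1e-3)
```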
Why?
https://github.com/NVIDIA/TensorRT/issues/2103#issuecomment-1170081821
Your problem seems similar to #2103. Does it apply to your case?
It doesn't seem to apply. I mean the ONNX and PyTorch inference results are not the same, not ONNX and TRT.
@yitian8377 does the issue still exist? We have a test to evaluate the accuracy of ResNet18, see https://github.com/NVIDIA/TensorRT/blob/release/8.6/tools/pytorch-quantization/tests/classification_flow_test.py. Thanks!
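That test checks end-to-end top-1 accuracy rather than elementwise closeness of the logits; a rough sketch of such a check, assuming a labeled validation DataLoader (`val_loader` is hypothetical) and the PyTorch/ONNX models from the sketches above:

```python
import onnxruntime as ort
import torch

sess = ort.InferenceSession("quant_resnet18.onnx", providers=["CPUExecutionProvider"])
input_name = sess.get_inputs()[0].name

torch_correct = onnx_correct = total = 0
with torch.no_grad():
    for images, labels in val_loader:  # hypothetical ImageNet-style validation loader
        torch_pred = model(images.cuda()).argmax(dim=1).cpu()
        onnx_logits = sess.run(None, {input_name: images.numpy()})[0]
        onnx_pred = torch.from_numpy(onnx_logits).argmax(dim=1)
        torch_correct += (torch_pred == labels).sum().item()
        onnx_correct += (onnx_pred == labels).sum().item()
        total += labels.numel()

print(f"PyTorch top-1: {torch_correct / total:.4f}  ONNX top-1: {onnx_correct / total:.4f}")
```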
Closing legacy issues. Please reopen if you still have an issue with the latest TRT. Thanks!