onnx-tensorrt
Error when compiling sampleOnnxMNIST with TensorRT 8.2.3-1
Description
In TensorRT 8.2.3-1, the IParser class now has a protected destructor.
This causes an error when compiling sampleOnnxMNIST:
```
Compiling: sampleOnnxMNIST.cpp
In file included from ../common/buffers.h:20,
                 from sampleOnnxMNIST.cpp:27:
../common/common.h: In instantiation of ‘void samplesCommon::InferDeleter::operator()(T*) const [with T = nvonnxparser::IParser]’:
/usr/include/c++/9/bits/unique_ptr.h:292:17:   required from ‘std::unique_ptr<_Tp, _Dp>::~unique_ptr() [with _Tp = nvonnxparser::IParser; _Dp = samplesCommon::InferDeleter]’
sampleOnnxMNIST.cpp:124:118:   required from here
../common/common.h:388:9: error: ‘virtual nvonnxparser::IParser::~IParser()’ is protected within this context
  388 |         delete obj;
      |         ^~~~~~
In file included from ../common/parserOnnxConfig.h:26,
                 from sampleOnnxMNIST.cpp:30:
/usr/local/include/NvOnnxParser.h:236:13: note: declared protected here
  236 |     virtual ~IParser() {}
      |             ^
make: *** [../Makefile.config:349: ../../bin/dchobj/sampleOnnxMNIST/sampleOnnxMNIST/sampleOnnxMNIST.o] Error 1
```
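For context, the pattern the samples use boils down to roughly the sketch below (a simplified reconstruction based on the error message, not a verbatim copy of common.h or sampleOnnxMNIST.cpp). It shows why a protected ~IParser() breaks the build: the unique_ptr deleter calls plain `delete` on the interface pointer.

```cpp
// Simplified sketch of the samples' deleter pattern (assumed shape, not the exact common.h code).
#include <memory>
#include <NvInfer.h>
#include <NvOnnxParser.h>

struct InferDeleter
{
    template <typename T>
    void operator()(T* obj) const
    {
        // This is the 'delete' the compiler rejects when ~IParser() is protected.
        delete obj;
    }
};

template <typename T>
using SampleUniquePtr = std::unique_ptr<T, InferDeleter>;

// sampleOnnxMNIST wraps the parser roughly like this; destroying the
// unique_ptr then requires an accessible (public) destructor on IParser:
//
//   auto parser = SampleUniquePtr<nvonnxparser::IParser>(
//       nvonnxparser::createParser(*network, logger));
```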
Environment
TensorRT 8.2 GA Update 2 for x86_64 Architecture taken from https://developer.nvidia.com/nvidia-tensorrt-8x-download
Can you give some more information on how you have TensorRT installed? I'm unable to repro the issue locally.
My steps:
Pull and run the 22.01 TensorRT Docker image:
docker pull nvcr.io/nvidia/tensorrt:22.01-py3
nvidia-docker run -it --rm nvcr.io/nvidia/tensorrt:22.01-py3
Download the TRT tar package and copy it into the container: `docker cp TensorRT-8.2.3.0.Linux.x86_64-gnu.cuda-11.4.cudnn8.2.tar.gz <container_name>:/workspace`
Unzip and build the samples:
tar -xf TensorRT-8.2.3.0.Linux.x86_64-gnu.cuda-11.4.cudnn8.2.tar.gz
cd TensorRT-8.2.3.0/samples
make -j
All the samples (including sampleOnnxMNIST) build as expected.
I installed TensorRT on Ubuntu 20.04 with
sudo apt-get install tensorrt
which installed the most recent version, 8.2.3.0-1+cuda11.4.
Have you tried building with the tar method?