                        trtexec is not installed in docker containers - Quickstart samples are broken
Description
My understanding is that the intended workflow is to take one of the provided dockerfiles from a release, build it, and then run TensorRT inside the container. However, I've tried several releases (8.4.3, 21.07, 21.10), built the dockerfiles, and started the containers, but I am not able to run trtexec from inside any of them, which is extremely confusing, since that is the one thing I was expecting these docker containers to provide.
My original idea was to run the container and execute Quickstart sample 02, but the kernel keeps crashing with:
2022-09-02 15:14:18.605122: W tensorflow/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libnvinfer.so.7'; dlerror: libnvinfer.so.7: cannot open shared object file: No such file or directory; LD_LIBRARY_PATH: /usr/local/nvidia/lib:/usr/local/nvidia/lib64:/workspace/TensorRT/build/out:/usr/lib/x86_64-linux-gnu
2022-09-02 15:14:18.605177: F tensorflow/compiler/tf2tensorrt/stub/nvinfer_stub.cc:49] getInferLibVersion symbol not found.
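As a first diagnostic (a hedged sketch, not part of the original report), it may help to check which libnvinfer sonames are actually present inside the container. The TensorFlow stub above is looking specifically for the .so.7 soname, so a container shipping TensorRT 8.x (which provides libnvinfer.so.8) would fail in exactly this way:
# Check whether any libnvinfer is registered with the dynamic linker
ldconfig -p | grep libnvinfer
# Search the filesystem for the specific library the stub is asking for
find / -name 'libnvinfer.so*' 2>/dev/null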
So I wanted to try Quickstart Sample 01 instead, which includes the command
trtexec --onnx=resnet50/model.onnx --saveEngine=resnet_engine_intro.trt --explicitBatch
which fails because trtexec is not found.
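A quick way to tell whether the binary is missing entirely or just not on the PATH (a hedged check, assuming TensorRT was installed inside the container via the Debian packages) is:
# Is trtexec on the PATH? If not, does it exist anywhere on disk?
which trtexec || find / -name trtexec -type f 2>/dev/null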
So it seems that none of the quickstart samples can be executed out of the box. What are the necessary steps to get this TensorRT docker container up and running? And what is the relationship between the containers from this repo and https://catalog.ngc.nvidia.com/orgs/nvidia/containers/tensorrt?
BTW: Can you update the older releases to contain the bugfix from https://github.com/NVIDIA/TensorRT/pull/2059? I always have to monkey-patch the dockerfiles, otherwise I cannot build them in the first place.
Environment
TensorRT Version: 8.0.3.4
NVIDIA GPU: NVIDIA GeForce RTX 2080
NVIDIA Driver Version: 470.141.03
CUDA Version: 11.4
CUDNN Version:
Operating System: Ubuntu 18.04
Python Version (if applicable):
Tensorflow Version (if applicable):
PyTorch Version (if applicable):
Baremetal or Container (if so, version):
Steps To Reproduce
- Clone this repository and checkout a tag like 20.07
- Run ./docker/build.sh --file docker/ubuntu-18.04.Dockerfile --tag tensorrt-ubuntu18.04-cuda11.3 --cuda 11.3.1
- Run ./docker/launch.sh --tag tensorrt-ubuntu18.04-cuda11.3 --gpus all
- Inside of the docker container execute trtexec
Expected result: execution of trtexec
Actual result: bash: trtexec: command not found
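To confirm that TensorRT itself is installed in the container and only the binary is unreachable, one can additionally list the installed packages (a hedged check, assuming the dockerfile installs TensorRT from the Debian packages):
# List installed TensorRT / nvinfer packages and their files
dpkg -l | grep -E 'tensorrt|nvinfer'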
@rajeevsrao @kevinch-nv Can you help check it ^ ^
@apacha For all my dockerfiles I add ENV PATH="/usr/src/tensorrt/bin:${PATH}" and it works. trtexec is installed in /usr/src/tensorrt/bin, it is just not linked into a directory on the PATH. It should work after that.
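For reference, a minimal sketch of the two workarounds (the paths assume the default install location /usr/src/tensorrt/bin mentioned above):
# Option 1: in the Dockerfile, put the TensorRT binaries on the PATH
ENV PATH="/usr/src/tensorrt/bin:${PATH}"
# Option 2: inside an already running container, export the PATH for the
# current shell, or symlink trtexec into a directory that is already on it
export PATH=/usr/src/tensorrt/bin:$PATH
ln -s /usr/src/tensorrt/bin/trtexec /usr/local/bin/trtexec
Either way, trtexec --help should then run from any directory inside the container.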