DNN-bench
Error when benchmarking ort-tensorrt with RTX 3090
Hi,
When running a benchmark on the GPU with the ort-tensorrt backend: ./bench_model.sh ./stsb-xlm-r-multilingual.bert.opt.onnx --repeat=100 --number=1 --warmup=10 --device=gpu --ort-tensorrt, I got the following error:
onnxruntime-tensorrt
=====================
== NVIDIA TensorRT ==
=====================
NVIDIA Release 20.01 (build 9719032)
NVIDIA TensorRT 7.0.0 (c) 2016-2019, NVIDIA CORPORATION. All rights reserved.
Container image (c) 2019, NVIDIA CORPORATION. All rights reserved.
https://developer.nvidia.com/tensorrt
To install Python sample dependencies, run /opt/tensorrt/python/python_setup.sh
To install open source parsers, plugins, and samples, run /opt/tensorrt/install_opensource.sh. See https://github.com/NVIDIA/TensorRT/tree/20.01 for more information.
WARNING: Detected NVIDIA GeForce RTX 3090 GPU, which is not yet supported in this version of the container
ERROR: No supported GPU(s) detected to run this container
Any help?
Thanks! Matthieu
Try docker pull mcr.microsoft.com/azureml/onnxruntime:latest-tensorrt and re-run. The 20.01 container ships TensorRT 7.0 on CUDA 10.2, which predates Ampere GPUs such as the RTX 3090, so that image refuses to run on your card.
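If it helps, here is a quick sanity check before re-running bench_model.sh (a sketch, assuming the NVIDIA Container Toolkit is set up on the host and that nvidia-smi is reachable inside the container):

# Pull the newer ONNX Runtime TensorRT image
docker pull mcr.microsoft.com/azureml/onnxruntime:latest-tensorrt
# Run nvidia-smi inside the container; the RTX 3090 should show up in the list
docker run --rm --gpus all mcr.microsoft.com/azureml/onnxruntime:latest-tensorrt nvidia-smi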
I am not sure whether Microsoft updates these images regularly. Otherwise, you can build the image yourself from the official Dockerfile, as sketched below:
https://github.com/microsoft/onnxruntime/blob/master/dockerfiles/Dockerfile.tensorrt
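For reference, the build usually looks something like this (a sketch; the onnxruntime-trt tag is just a placeholder, and the exact build context and Dockerfile path may differ between releases, so check the dockerfiles README in the repo):

# Clone ONNX Runtime and move to the dockerfiles directory
git clone https://github.com/microsoft/onnxruntime.git
cd onnxruntime/dockerfiles
# Build the TensorRT image from the official Dockerfile
docker build -t onnxruntime-trt -f Dockerfile.tensorrt .

You would then need to point bench_model.sh at the locally built image instead of the stock one; I don't recall off-hand where DNN-bench configures the image tag.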