YOLOX-ROS
Issue with YOLOX-ROS TensorRT Conversion: CUDA Compatibility Error
Hello,
I am working on converting the YOLOX Tiny model to TensorRT format using the YOLOX-ROS package. However, I encountered a CUDA compatibility error during the conversion process. The command I used was:
./src/YOLOX-ROS/weights/tensorrt/convert.bash yolox_tiny 16
The script seemed to initiate correctly, but then it terminated with the following error message:
Cuda failure: forward compatibility was attempted on non supported HW
./src/YOLOX-ROS/weights/tensorrt/convert.bash: line 37: 8669 Aborted (core dumped) /usr/src/tensorrt/bin/trtexec --onnx=$SCRIPT_DIR/../onnx/$MODEL.onnx --saveEngine=$SCRIPT_DIR/$MODEL.trt --fp16 --verbose --workspace=$((1<<$TRT_WORKSPACE))
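For context, with the arguments above (`yolox_tiny` and `16`) the failing line 37 expands roughly as sketched below. The variable names come from the script itself; the second argument only controls the workspace size via a bit shift (in TensorRT 8.x, `--workspace` is given in MiB), so it is unrelated to the crash:

```shell
# Values implied by "./convert.bash yolox_tiny 16"
MODEL=yolox_tiny
TRT_WORKSPACE=16

# 1 << 16 = 65536, so the script requests a 65536 MiB workspace
echo "workspace: $((1<<TRT_WORKSPACE)) MiB"

# Expanded trtexec call (SCRIPT_DIR is the directory containing convert.bash):
# /usr/src/tensorrt/bin/trtexec --onnx=$SCRIPT_DIR/../onnx/yolox_tiny.onnx \
#     --saveEngine=$SCRIPT_DIR/yolox_tiny.trt --fp16 --verbose --workspace=65536
```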
This error suggests a compatibility issue between my hardware and the CUDA setup. Any insight or guidance on how to resolve it would be greatly appreciated. Thank you for your time and assistance.
First of all, please tell me what environment you are using:
- Ubuntu version
- CUDA(-toolkit) version
- TensorRT version
- GPU (RTX4090? Jetson Orin?)
Errors during conversion are usually due to a version mismatch between the GPU driver, CUDA, and TensorRT.
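For what it's worth, "forward compatibility was attempted on non supported HW" usually appears when the CUDA user-space libraries are newer than the installed driver supports; CUDA's forward-compatibility mode only works on data-center GPUs, so on a GeForce card like the RTX 3090 the usual fix is to upgrade the driver. A minimal sketch of a driver-version check follows (the 520.61.05 minimum for CUDA 11.8 is taken from NVIDIA's release notes and should be verified for your exact toolkit; the sample driver value stands in for real `nvidia-smi` output):

```shell
# Compare dot-separated versions with sort -V; true if $1 >= $2
version_ge() {
    [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

# Assumed minimum Linux driver for CUDA 11.8 (check NVIDIA's release notes)
MIN_DRIVER="520.61.05"

# In practice, obtain this with:
#   nvidia-smi --query-gpu=driver_version --format=csv,noheader
DRIVER="525.89.02"

if version_ge "$DRIVER" "$MIN_DRIVER"; then
    echo "driver $DRIVER supports CUDA 11.8"
else
    echo "driver $DRIVER is too old for CUDA 11.8 -- upgrade the driver"
fi
```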
Thanks for letting me know.
- Ubuntu version: 22.04
- CUDA(-toolkit) version: 11.4
- TensorRT version: 8.4
- GPU: RTX 3090
For reference, here is an environment in which the conversion worked on my machine:
- Ubuntu: 22.04
- GPU driver: 525.89.02
- CUDA: 11.8
- TensorRT (trtexec): v8.5.1
- GPU: RTX 3060