tensorrtx
yolov8n inference and post-processing time increases around the 20th image and stays elevated
Env
- GPU: GTX 1660 Ti (laptop)
- OS: Windows 11
- CUDA: 11.8
- TensorRT: 8.6.1.6
About this repo
- which branch/tag/commit are you using? yolov5-v7.0
- which model? yolov8n
Your problem
When running detection on 72 images, the inference and post-processing time increases noticeably after a certain number of images and eventually settles at about 72 ms per image. It should not be a hardware problem; the same issue occurs when I switch to a server.
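For reference, one way to rule out measurement skew is to time each image with CUDA events and synchronize the stream before reading the timer; with host-side timers alone, asynchronous work still queued from earlier frames can make later frames appear slower. A minimal sketch, assuming `context`, `buffers`, and `stream` are the objects created by the repo's yolov8 demo (the names here are illustrative, not the demo's exact API):

```cpp
#include <cuda_runtime.h>
#include <NvInfer.h>

// Times a single enqueue on the given stream. Assumes the preprocessed input has
// already been copied into the device buffers.
float timedInfer(nvinfer1::IExecutionContext* context, void** buffers, cudaStream_t stream) {
    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    cudaEventRecord(start, stream);
    context->enqueueV2(buffers, stream, nullptr);   // asynchronous launch of the engine
    cudaEventRecord(stop, stream);

    cudaStreamSynchronize(stream);                  // make sure this image has actually finished

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);         // GPU time for exactly this inference
    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    return ms;
}
```

If the per-image time measured this way stays flat while the end-to-end host measurement grows, the slowdown is likely on the CPU side (post-processing, copies, queuing) rather than in the engine itself.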
Check whether the GPU clock frequency and power draw change during the run.
I checked with `nvidia-smi -l 1` in cmd. Throughout the run the GPU power stayed between 4 W and 24 W and GPU utilization stayed between 1% and 17%, so it does not look like a GPU limitation.
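A small follow-up that might help narrow this down: `nvidia-smi -l 1 --query-gpu=clocks.sm,power.draw,temperature.gpu,utilization.gpu --format=csv` also logs the SM clock, so a frequency drop (e.g. laptop power or thermal limits) would show up even while utilization and power look low.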
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.