YOLOv8-TensorRT

YOLOv8 accelerated with TensorRT!

57 YOLOv8-TensorRT issues

Hi, I am facing a runtime error while running infer-pose.py. Command to export the YOLOv8-nano pose model: `yolo export model=runs/pose/train13/weights/nano_club_pose_model.pt format=onnx simplify=True opset=11` ONNX to TensorRT conversion command: `/opt/tensorrt/bin/trtexec...
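
Before rerunning trtexec, a quick way to narrow down where the pose pipeline breaks is to check the exported ONNX graph on its own. The sketch below is an assumption-laden illustration, not part of the issue: the `.onnx` path next to the `.pt` file and the 1x3x640x640 input size are guesses taken from the export command above.

```python
# Sketch: sanity-check the exported pose ONNX model before converting it with trtexec.
# The path and the 1x3x640x640 input size are assumptions, not confirmed by the issue.
import numpy as np
import onnx
import onnxruntime as ort

onnx_path = "runs/pose/train13/weights/nano_club_pose_model.onnx"  # assumed location

# Raises if the exported graph is structurally invalid.
onnx.checker.check_model(onnx.load(onnx_path))

# Print input/output names and shapes so the TensorRT inference code can be matched to them.
sess = ort.InferenceSession(onnx_path, providers=["CPUExecutionProvider"])
for i in sess.get_inputs():
    print("input :", i.name, i.shape, i.type)
for o in sess.get_outputs():
    print("output:", o.name, o.shape, o.type)

# One dummy forward pass confirms the graph runs outside TensorRT.
dummy = np.zeros((1, 3, 640, 640), dtype=np.float32)  # assumed input size
outputs = sess.run(None, {sess.get_inputs()[0].name: dummy})
print([o.shape for o in outputs])
```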

I run Docker and build on a Jetson Xavier. - JetPack: 5.0.2 - TensorRT: 8.4.1.5 - Docker image: nvcr.io/nvidia/l4t-tensorrt:r8.5.2.2-devel `sudo docker run -it --rm nvcr.io/nvidia/l4t-tensorrt:r8.5.2.2-devel` `apt install libopencv-dev` `git clone https://github.com/triple-Mu/YOLOv8-TensorRT` cd...

Hello, I have a YOLOv8 model that I converted to an engine. With the FP32 engine, inference works well and the objects are detected very well. /usr/src/tensorrt/bin/trtexec \ --onnx=yolov8s.onnx...

Fix this: NMS using NumPy is not equivalent to `torchvision.ops.nms`. See #183
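
As an illustration of what "equivalent" means here, below is a minimal pure-NumPy NMS sketch written to match `torchvision.ops.nms` semantics: boxes in x1, y1, x2, y2 format, kept indices returned in descending score order, and IoU computed without the legacy "+1" pixel offset. This is only a sketch for discussion, not the repo's shipped implementation.

```python
import numpy as np

def nms_numpy(boxes: np.ndarray, scores: np.ndarray, iou_threshold: float) -> np.ndarray:
    """Pure-NumPy NMS intended to mirror torchvision.ops.nms (boxes are x1, y1, x2, y2)."""
    x1, y1, x2, y2 = boxes[:, 0], boxes[:, 1], boxes[:, 2], boxes[:, 3]
    areas = (x2 - x1) * (y2 - y1)       # no "+1" offset, same convention as torchvision
    order = scores.argsort()[::-1]      # candidate indices, highest score first
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(i)
        # IoU of the current best box against all remaining candidates
        xx1 = np.maximum(x1[i], x1[order[1:]])
        yy1 = np.maximum(y1[i], y1[order[1:]])
        xx2 = np.minimum(x2[i], x2[order[1:]])
        yy2 = np.minimum(y2[i], y2[order[1:]])
        inter = np.clip(xx2 - xx1, 0, None) * np.clip(yy2 - yy1, 0, None)
        iou = inter / (areas[i] + areas[order[1:]] - inter)
        # torchvision removes boxes whose IoU is strictly greater than the threshold
        order = order[1:][iou <= iou_threshold]
    return np.asarray(keep, dtype=np.int64)
```

Comparing its output with `torchvision.ops.nms(torch.from_numpy(boxes), torch.from_numpy(scores), iou_threshold)` on random boxes is a quick way to check the equivalence; boxes with exactly equal scores may still be ordered differently between the two.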

First of all, thank you very much for your work @triple-Mu . Let me describe the problem I encountered. * Commit id: 6c396351551c3617286d3b3abac2d5b6d54a8833 * TensorRT version: 8.2.4.2 * Driver Version:...

The model trained with multiple classes (3) performs normally with the YOLOv8 CLI predict command. I confirmed through debugging that objects of only one class were detected when using the example here....

Hi, I encountered an error when running inference with my custom model. As you can see, there are a lot of bounding boxes and the inferences are wrong (every detection is the round-about class). ![Screenshot...

`C:\Users\15258>python3 "C:\Users\15258\Downloads\YOLOv8-TensorRT-main\YOLOv8-TensorRT-main\infer-det.py" --engine "C:\Users\15258\Desktop\model++\runs\detect\train26\weights\best.engine" --imgs "C:\Users\15258\Desktop\model++\000002 copy.jpg" --out-dir outputs --device cuda:0` Traceback (most recent call last): File "C:\Users\15258\Downloads\YOLOv8-TensorRT-main\YOLOv8-TensorRT-main\infer-det.py", line 1, in `from models import TRTModule # isort:skip` File "C:\Users\15258\Downloads\YOLOv8-TensorRT-main\YOLOv8-TensorRT-main\models\__init__.py", line...

## Env - onnx version 1.15 - torch version 2.1.2 - tensorrt version 8.6.1 ## Your problem ![image](https://github.com/triple-Mu/YOLOv8-TensorRT/assets/16401446/f2444ade-6552-4007-88f6-9ad81bd053b3) A warning occurred when the model was converted to an ONNX model....

Hi there, I need to run inference with a batch size of 2. I have exported the model to ONNX format using the command `yolo export model=best.pt format=onnx simplify=True opset=11...
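
One possible route to a batch-size-2 engine is to export with a dynamic batch dimension and then build the engine with explicit shape profiles. The sketch below uses the standard ultralytics Python API rather than this repo's export scripts, and assumes the default `images` input name and a 640x640 input; whether the repo's post-processing supports a dynamic batch is a separate question.

```python
# Sketch: export with a dynamic batch dimension so TensorRT can build a profile for N=2.
# Plain ultralytics export, not this repo's export-det.py; input name/size are assumptions.
from ultralytics import YOLO

model = YOLO("best.pt")
model.export(format="onnx", simplify=True, opset=11, dynamic=True)  # writes best.onnx

# The dynamic-batch ONNX can then be built with explicit shape profiles, e.g.:
#   trtexec --onnx=best.onnx \
#           --minShapes=images:1x3x640x640 \
#           --optShapes=images:2x3x640x640 \
#           --maxShapes=images:2x3x640x640 \
#           --saveEngine=best.engine
```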