custom model detection fails
import os
import time

import numpy as np
import cv2
from cvu.detector import Detector


def detect_image(weight,
                 image_path,
                 output_image,
                 classes="coco",
                 auto_install=True):
    # load model
    model = Detector(classes=classes, weight=weight, backend="tensorrt")

    # read image
    image = cv2.imread(image_path)

    # inference
    preds = model(image)
    print(preds)

    # draw image
    preds.draw(image)

    # write image
    print(output_image)
    cv2.imwrite(output_image, image)


if __name__ == "__main__":
    classes = ["cell1", "cell2"]
    detect_image("best.engine", "test.jpg", "result.jpg", classes)
I trained a model with YOLOv5 6.1, converted it to a TensorRT engine with cvu, and ran the code above for inference, but I get the following error. My labels have two classes.
Traceback (most recent call last):
  File "est_tensorrt.py", line 36, in <module>
    detect_image("best.engine", "test.jpg", "result.jpg",classes)
  File "est_tensorrt.py", line 21, in detect_image
    preds = model(image)
  File "/data/anaconda3/envs/torch17/lib/python3.7/site-packages/cvu/detector/yolov5/core.py", line 96, in __call__
    outputs = self._model(processed_inputs)
  File "/data/anaconda3/envs/torch17/lib/python3.7/site-packages/cvu/detector/yolov5/backends/yolov5_tensorrt.py", line 257, in __call__
    preds = self._post_process(outputs)
  File "/data/anaconda3/envs/torch17/lib/python3.7/site-packages/cvu/detector/yolov5/backends/yolov5_tensorrt.py", line 306, in _post_process
    outputs = outputs[-1].reshape((1, -1, self._nc + 5))
ValueError: cannot reshape array of size 176400 into shape (1,newaxis,85)
I tried modifying self._nc in yolov5_tensorrt.py; that lets it run, but the result is not correct.
my env: os: win10 yolov5 6.1 python==3.8 cvu==0.0.1a1
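For context, the numbers in that traceback already point at a class-count mismatch rather than a broken engine: the flat output of 176400 values divides evenly by 7 (2 classes + 5 box/objectness values) but not by 85 (80 COCO classes + 5). A minimal sketch of that arithmetic, reproducing only the reshape step from the traceback with a dummy array of the same size (not real engine output):

import numpy as np

flat = np.zeros(176400, dtype=np.float32)  # same size as the engine output in the traceback

nc_coco = 80    # default COCO class count
nc_custom = 2   # the two custom classes ("cell1", "cell2")

# 176400 is not divisible by 80 + 5 = 85, so this reshape raises the reported ValueError
try:
    flat.reshape((1, -1, nc_coco + 5))
except ValueError as err:
    print(err)  # cannot reshape array of size 176400 into shape (1,newaxis,85)

# 176400 / (2 + 5) = 25200 boxes, so the reshape succeeds when nc matches the model
preds = flat.reshape((1, -1, nc_custom + 5))
print(preds.shape)  # (1, 25200, 7)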
@fanweiya can you share your custom ONNX and PyTorch model with us? We can try to debug on our end.
I just found that with my custom engine weights I had to specify my classes (I only had 3) for it to work: --classes blue yellow orange
Maybe it was trying to get all 80 COCO classes and couldn't fit them?
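For reference, the equivalent fix through the Python API used earlier in this thread is to pass the custom class names to Detector instead of leaving the default "coco" list. A minimal sketch, where the class names and engine path are placeholders for your own labels and weights:

from cvu.detector import Detector

# Passing the 3 custom class names makes post-processing expect nc + 5 = 8 values per box
# instead of the 85 it would assume for the default 80 COCO classes.
# "best.engine" and the class names below are placeholders.
model = Detector(classes=["blue", "yellow", "orange"],
                 weight="best.engine",
                 backend="tensorrt")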
Closing due to inactivity. Feel free to re-open.