
Converted model.engine does not work properly on DeepStream

Open · xarauzo opened this issue 2 years ago · 0 comments

I converted an MMDet model using this tool and obtained an output 'model.engine'. However, when running inference in DeepStream (with the amirstan plugin), the results are not as expected. TensorRT reports no errors during inference. With a 0.5 confidence threshold no detections are shown; if I lower the threshold to 0.1 (just to see what happens), I get a lot of bounding boxes, but none of them are correct.

I am using DeepStream 5.0 on a Jetson Xavier NX running JetPack 4.4 (I cannot change either the DeepStream or the JetPack version).
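Not part of the original report, but one way to narrow down whether the problem is in the exported engine or in the DeepStream integration is to load 'model.engine' directly with the TensorRT Python API and run a preprocessed image through it. The sketch below is only an illustration: it assumes a static-shape engine, that the amirstan plugin library is called libamirstan_plugin.so and is on the loader path, and that pycuda is installed; paths and preprocessing would need to match your actual setup.

```python
import ctypes

import numpy as np
import tensorrt as trt
import pycuda.driver as cuda
import pycuda.autoinit  # noqa: F401  (creates a CUDA context)

# The amirstan custom plugins must be loaded before deserializing the engine,
# otherwise the custom layers cannot be resolved. The library name/path here
# is an assumption; point it at your build of the plugin.
ctypes.CDLL("libamirstan_plugin.so")

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

with open("model.engine", "rb") as f, trt.Runtime(TRT_LOGGER) as runtime:
    engine = runtime.deserialize_cuda_engine(f.read())

context = engine.create_execution_context()

# Allocate host/device buffers for every binding.
# Note: if the engine was built with dynamic shapes, call
# context.set_binding_shape(...) on the input first and query shapes from the
# context instead of the engine.
bindings, host_bufs, dev_bufs = [], [], []
for i in range(engine.num_bindings):
    shape = engine.get_binding_shape(i)
    dtype = trt.nptype(engine.get_binding_dtype(i))
    host = np.zeros(trt.volume(shape), dtype=dtype)
    dev = cuda.mem_alloc(host.nbytes)
    bindings.append(int(dev))
    host_bufs.append(host)
    dev_bufs.append(dev)

# Fill the input binding with a preprocessed image. The normalization must
# match the img_norm_cfg of the MMDetection config used for the conversion.
# host_bufs[0][:] = preprocessed_image.ravel()

for i in range(engine.num_bindings):
    if engine.binding_is_input(i):
        cuda.memcpy_htod(dev_bufs[i], host_bufs[i])

context.execute_v2(bindings)

for i in range(engine.num_bindings):
    if not engine.binding_is_input(i):
        cuda.memcpy_dtoh(host_bufs[i], dev_bufs[i])
        print(engine.get_binding_name(i), host_bufs[i][:10])
```

If the engine produces sensible boxes this way, the issue is more likely in the DeepStream side (for example, net-scale-factor/offsets in the nvinfer config not matching the img_norm_cfg used during conversion) than in the conversion itself.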

xarauzo · Jul 14 '22 14:07