Philipp Schmidt

Results 40 comments of Philipp Schmidt

Hello @Tabrizian and happy new year! Sorry for the late reply, we were pretty busy in Q4. I'm hoping we can address this together this year; we are happy to...

+1 for this; the NMS postprocessing is absolutely necessary for good throughput. It should be fairly easy to port this from the existing YOLO variants.

The relevant code for this is here: https://github.com/WongKinYiu/yolov7/blob/84932d70fb9e2932d0a70e4a1f02a1d6dd1dd6ca/models/experimental.py#L111 The classes **ORT_NMS, TRT_NMS, ONNX_ORT, ONNX_TRT, End2End** should be compatible with YOLO-NAS, I believe.
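Roughly what I have in mind, as a minimal sketch (untested with YOLO-NAS; the raw output layout would likely need a small adapter first). It assumes `End2End` keeps the signature from the linked `experimental.py` and that `models.get` is the super-gradients loader:

```python
# Sketch: wrap a YOLO-NAS model with yolov7's End2End module and export with NMS baked in.
# Assumptions: End2End(model, max_obj, iou_thres, score_thres, max_wh, device) as in the
# linked experimental.py, and that the model's decoded outputs match what End2End expects.
import torch
from super_gradients.training import models
from models.experimental import End2End  # from the WongKinYiu/yolov7 repo

device = torch.device("cpu")
model = models.get("yolo_nas_s", pretrained_weights="coco").eval()

# max_wh=None selects the TensorRT path (EfficientNMS_TRT via TRT_NMS/ONNX_TRT);
# an integer max_wh selects the ONNX-native NonMaxSuppression path (ORT_NMS/ONNX_ORT).
end2end = End2End(model, max_obj=100, iou_thres=0.45, score_thres=0.25,
                  max_wh=None, device=device).eval()

dummy = torch.zeros(1, 3, 640, 640, device=device)
torch.onnx.export(
    end2end, dummy, "yolo_nas_end2end.onnx",
    opset_version=12,
    input_names=["images"],
    output_names=["num_dets", "det_boxes", "det_scores", "det_classes"],
)
```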

Also, they enable native ONNX NMS as well, if I'm not mistaken. So not only can the model be exported to TensorRT, but the native ONNX backends also work with NMS...
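For the ONNX-native variant (integer `max_wh`, so NMS is the standard `NonMaxSuppression` op), you can run the exported file directly in onnxruntime. The output layout is an assumption based on yolov7's ONNX_ORT wrapper, so check it in Netron first:

```python
# Sketch: run the NMS-in-graph export on a plain ONNX backend (CPU here).
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("yolo_nas_end2end.onnx", providers=["CPUExecutionProvider"])
input_name = session.get_inputs()[0].name

img = np.zeros((1, 3, 640, 640), dtype=np.float32)  # preprocessed image goes here
outputs = session.run(None, {input_name: img})
for meta, value in zip(session.get_outputs(), outputs):
    print(meta.name, value.shape)  # already-filtered detections, no Python-side NMS needed
```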

The reason they are defined for yolov7 is this manual step during export: ![image](https://github.com/Deci-AI/super-gradients/assets/25586333/d229c35f-1039-4671-aed1-a44b9fd7df9e) ![image](https://github.com/Deci-AI/super-gradients/assets/25586333/9ae3828e-0e06-4726-bf8d-75aed290449c)

Basically, ONNX cannot know the output dimensions of the non-native plugin, so we have to specify the dimensions manually instead. Same way you can specify the names of the...
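Something like this after export, in the spirit of the screenshots above. The shape values are assumptions (batch=1, top-k=100, the four EfficientNMS_TRT outputs); adjust them to your export settings:

```python
# Sketch: overwrite the unknown output shapes of the TensorRT NMS plugin in the ONNX graph.
import onnx

onnx_model = onnx.load("yolo_nas_end2end.onnx")

batch, topk = 1, 100
# num_dets: (batch, 1), det_boxes: (batch, topk, 4),
# det_scores: (batch, topk), det_classes: (batch, topk)
shapes = [[batch, 1], [batch, topk, 4], [batch, topk], [batch, topk]]

for output, shape in zip(onnx_model.graph.output, shapes):
    dims = output.type.tensor_type.shape.dim
    del dims[:]                       # drop whatever shape inference produced (nothing useful here)
    for value in shape:
        dims.add().dim_value = value  # write the known dimension explicitly

onnx.save(onnx_model, "yolo_nas_end2end.onnx")
```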

Two options: 1) change the method `convert_to_onnx` to make that adjustment before serializing to file, or 2) be lazy and just load the model from the ONNX file again like yolov7 does...

The output dimensions in the ONNX file have no functional effect. TensorRT is able to infer the output dimensions of its own plugin, so the TensorRT engine will be fine either way. But I...

Please post the output of netron with your code applied.
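For reference, if you have the Python package installed, this is the quickest way to get that view (equivalent to the `netron model.onnx` CLI):

```python
# Open the exported graph in the browser and screenshot the output nodes.
import netron

netron.start("yolo_nas_end2end.onnx")
```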