Pytorch_Retinaface
Is there any method for converting mobilenet0.25 to a TensorRT engine?
Hello all.
I am currently working on converting the MobileNet0.25 model to TensorRT.
I tried the conversion in the TensorRT environment provided by Nvidia, but I couldn't get it to work.
Do you guys have any method?
Thanks in advance.
@G-Bong I have a fork with TRT engine conversion added. https://github.com/gan3sh500/retinaface-pytorch
@gan3sh500 Thank you so much. Would you mind if I study your code to learn TensorRT? :)
@G-Bong Go ahead. It is mostly adapted from the links I mention in the code. That code is specific to a single input and a single output.
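For reference, not the fork's actual code, here is a minimal sketch of the usual single-input/single-output pattern for loading a serialized TensorRT engine and running it with pycuda. The binding order (input at index 0, output at index 1), float32 dtypes, and fixed shapes are assumptions:

import numpy as np
import pycuda.autoinit  # noqa: F401  (creates a CUDA context on import)
import pycuda.driver as cuda
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def load_engine(engine_path):
    # Deserialize a previously built engine file.
    with open(engine_path, "rb") as f, trt.Runtime(TRT_LOGGER) as runtime:
        return runtime.deserialize_cuda_engine(f.read())

def infer(engine, image):
    # Assumes exactly one input binding (0) and one output binding (1).
    context = engine.create_execution_context()
    h_input = np.ascontiguousarray(image, dtype=np.float32)
    h_output = np.empty(trt.volume(engine.get_binding_shape(1)), dtype=np.float32)
    d_input = cuda.mem_alloc(h_input.nbytes)
    d_output = cuda.mem_alloc(h_output.nbytes)
    stream = cuda.Stream()
    # Host -> device, run the network, device -> host, all on one stream.
    cuda.memcpy_htod_async(d_input, h_input, stream)
    context.execute_async_v2(bindings=[int(d_input), int(d_output)],
                             stream_handle=stream.handle)
    cuda.memcpy_dtoh_async(h_output, d_output, stream)
    stream.synchronize()
    return h_output

Note that RetinaFace itself has three outputs (boxes, scores, landmarks), so the output handling above would need to be repeated per binding for the full model.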
@gan3sh500 Thanks a lot
@gan3sh500 Sir, I have a question. When I tried to convert the MobileNet model to TensorRT, the interpolate layer caused an error during the ONNX-to-TensorRT step. Did you run into this error? If you solved it, how did you do it?
@G-Bong I have just solved it. The problem occurs when PyTorch's F.interpolate is converted to an ONNX Resize layer: the size parameter of interpolate can't be mapped to the Resize scales parameter. So you can set scale_factor=2 in interpolate directly, like this:
up3 = F.interpolate(output3, scale_factor=2, mode="nearest")
up2 = F.interpolate(output2, scale_factor=2, mode="nearest")
BTW, the network input size must stay the same.
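For reference, a minimal sketch of the export step after the change above. The input size (640x640), output names, ONNX file name, and opset version are my assumptions, not code from this repo:

import torch
from data import cfg_mnet                  # repo's mobilenet0.25 config
from models.retinaface import RetinaFace   # repo's model definition

# Assumes the FPN upsampling has been switched to scale_factor=2 as shown above.
net = RetinaFace(cfg=cfg_mnet, phase="test")
# ... load your trained weights into `net` here (omitted) ...
net.eval()

dummy = torch.randn(1, 3, 640, 640)  # fixed input size; must match what you feed at runtime
torch.onnx.export(
    net,
    dummy,
    "retinaface_mnet025.onnx",
    input_names=["input"],
    output_names=["loc", "conf", "landms"],  # bbox regressions, scores, landmarks
    opset_version=11,  # Resize in opset 11 carries scales, which scale_factor maps to
)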
hello @gan3sh500 can you share your onnx2tensorrt conversion?
Just use TensorRT >= 7.1.
@QuantumLiu why and where is tensorrt Retinaface in tensorrt >7.1?
I mean use the trtexec tool from TensorRT >= 7.1 to generate the engine file from the ONNX file. TensorRT 7.1 supports the F.interpolate function (the ONNX Resize op).