Álvaro Faubel Sanchis
Hi @razvypp, I was wondering if there are any updates on this. I'm particularly interested in support for dynamic_axes when converting to ONNX. Looking forward to any news, and thanks in advance!
I’m looking for an ONNX conversion that accepts dynamic_axes, to perform batch inference with more than one sample at a time. Thanks!
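For context, a minimal sketch of what such an export could look like with `torch.onnx.export`. The input/output names, the dummy input shape, and the opset version here are illustrative assumptions, not taken from this thread:

```python
# Sketch: exporting a PyTorch model to ONNX with a dynamic batch dimension,
# so the exported model accepts batches of more than one sample.
import torch

# Axis 0 (the batch axis) of both the input and the output is marked
# dynamic; the names "input"/"output" are assumptions for illustration.
dynamic_axes = {"input": {0: "batch"}, "output": {0: "batch"}}

def export_with_dynamic_batch(model, dummy_input, path="model.onnx"):
    model.eval()
    torch.onnx.export(
        model,
        dummy_input,            # e.g. torch.randn(1, 3, 224, 224)
        path,
        input_names=["input"],
        output_names=["output"],
        dynamic_axes=dynamic_axes,
        opset_version=17,       # assumed; pick what your ops require
    )
```

At inference time the session should then accept any batch size along axis 0.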
Thank you so much, @itskyf, for your contribution! Have you had a chance to test whether the execution works with ONNX Runtime?
Hi @ZhengPeng7, I believe the issue arises because ONNX defines the `DeformConv` operator (since opset 19), but unfortunately, ONNX Runtime does not currently implement it. As a result, any code that includes...
@itskyf Could you please provide the code in which you have converted the dynamic batched model to TensorRT? Thanks in advance!
Hi @ZhengPeng7, I might be able to help. To export with an opset above 19, you'll need to update your PyTorch version to 2.4 or newer. In the provided example, it uses opset 19...
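That compatibility note can be expressed as a small version gate; the helper name and the exact cutoff below are assumptions based on this comment, not an official PyTorch API:

```python
# Sketch: check whether a given torch version can export a given opset,
# per the note above that opset > 19 needs PyTorch 2.4 or newer.
def supports_opset(torch_version: str, opset: int) -> bool:
    # Works for versions like "2.4.0" or "2.4.0+cu121"; prerelease
    # suffixes are not handled in this simple sketch.
    major, minor = (int(x) for x in torch_version.split(".")[:2])
    return opset <= 19 or (major, minor) >= (2, 4)
```

In practice you would compare against `torch.__version__` before calling `torch.onnx.export`.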
Hi @ZhengPeng7, you’re correct. It might be worth testing if an onnxruntime session works by specifying the TensorRT execution provider like this: `sess = ort.InferenceSession('model.onnx', providers=['TensorrtExecutionProvider', 'CUDAExecutionProvider'])` If that doesn’t...
Same issue here
Hi, thank you so much for taking the time to reply! I wanted to ask specifically about the configurations, losses, and backbone you would recommend for my use case. Are...