
Does it support dynamic batch inference?

Open wangjingg opened this issue 2 years ago • 4 comments

When I convert ONNX to TensorRT and use a dynamic batch, the TensorRT output is all zeros [0, 0, 0, 0, ...].

wangjingg avatar Feb 28 '22 08:02 wangjingg

> When I convert ONNX to TensorRT and use a dynamic batch, the TensorRT output is all zeros [0, 0, 0, 0, ...].

Tried dynamic batch; it failed. Neither the ONNX model nor the TensorRT engine produces the expected output. This may be related to the implementation of Swin Transformer, but I am not sure.

maggiez0138 avatar Mar 08 '22 17:03 maggiez0138

When I remove the softmax in the PyTorch->ONNX export (dynamic batch), it works. You can try it.

wangjingg avatar Mar 09 '22 07:03 wangjingg
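Removing the softmax before export can be sketched as below (the stand-in model is an assumption; the point is that the exported graph then emits raw logits, which leaves argmax-based predictions unchanged, and softmax can still be applied on the host afterwards if probabilities are needed):

```python
# Sketch: stripping a trailing softmax before ONNX export.
# The Sequential model is a stand-in for the real network's head.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(16, 4), nn.Softmax(dim=-1))

# Replace the final softmax with Identity so the exported graph ends at logits.
model[-1] = nn.Identity()
model.eval()

x = torch.randn(2, 16)
logits = model(x)          # raw logits now, shape (2, 4)
probs = torch.softmax(logits, dim=-1)  # optional: recover probabilities on the host
```

Since softmax is monotonic per row, dropping it does not change which class has the highest score.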

> When I remove the softmax in the PyTorch->ONNX export (dynamic batch), it works. You can try it.

Thanks for the info, I will give it a try.

maggiez0138 avatar Mar 17 '22 09:03 maggiez0138

Hello there. I have the exact same problem with dynamic inputs, but I could not find a way to cancel the softmax when converting the PyTorch model to ONNX. Can you help me with this?

fatemebafghi avatar Oct 19 '22 08:10 fatemebafghi

> Hello there. I have the exact same problem with dynamic inputs, but I could not find a way to cancel the softmax when converting the PyTorch model to ONNX. Can you help me with this?

Updated the repo. Now dynamic input is supported.

maggiez0138 avatar Dec 13 '22 12:12 maggiez0138
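For reference, even with a dynamic-axis ONNX model, the TensorRT engine needs an optimization profile covering the intended batch range. With `trtexec` that can look like the following (the input tensor name, shapes, and file names are placeholders for whatever the actual export produced):

```shell
# Build a dynamic-batch TensorRT engine from a dynamic-axis ONNX model.
# "input" must match the ONNX input tensor name; shapes are N x C x H x W.
trtexec --onnx=model.onnx \
        --minShapes=input:1x3x224x224 \
        --optShapes=input:8x3x224x224 \
        --maxShapes=input:16x3x224x224 \
        --saveEngine=swin.engine
```

At inference time, any batch size between the min and max shapes can then be bound to the engine's execution context.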