Cannot export to TensorFlow or TFLite to run on mobile
ONNX-TF only supports some operators (like Where) up to opset 13, but your model uses operators (like aten::scaled_dot_product_attention) that are only available in ONNX opset 14 or higher. Summary of your situation:

- If you export with opset 13: PyTorch cannot export your model, because it uses newer operators.
- If you export with opset 16: ONNX-TF cannot convert your model, because it does not support some ops at that version.
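For what it's worth, here is a minimal sketch of the two failing paths; the tiny module and tensor shapes are just illustrative stand-ins, not the real RF-DETR model or export script:

```python
import torch
import torch.nn.functional as F

class TinyAttn(torch.nn.Module):
    def forward(self, q, k, v):
        # The same op that blocks the export: it has no ONNX symbolic before opset 14.
        return F.scaled_dot_product_attention(q, k, v)

q = k = v = torch.randn(1, 4, 8, 16)  # (batch, heads, seq, head_dim), arbitrary

# Path 1 -- opset 13: the export itself fails (UnsupportedOperatorError).
try:
    torch.onnx.export(TinyAttn(), (q, k, v), "attn13.onnx", opset_version=13)
except Exception as e:
    print("opset 13 export failed:", e)

# Path 2 -- opset 16: the export succeeds, but ONNX-TF later rejects the
# full model's graph because it only implements some ops (e.g. Where)
# up to opset 13.
torch.onnx.export(TinyAttn(), (q, k, v), "attn16.onnx", opset_version=16)
```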
@apcarita have you tried this PR for TFLite conversion: https://github.com/roboflow/rf-detr/pull/45
Seems to work for me; will test on device and update. Thanks!
How is the performance of the TFLite model on a mobile device?
Awesome, thank you!
@gpokat do you have any update for us on how it performs on Android in TFLite format? Would be very interesting. Thx
Guys, I'm also very curious, but the latest tests show that we lose accuracy during the double conversion from PyTorch to TFLite via ONNX, especially for the quantized variant. Tested on QNX, and performance is not much better than YOLOv5 with FP16 in comparison. Not tested on Android, but there is plenty of work to do before real benchmarking, which I have no time for right now.
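For context, the "double conversion" being discussed is roughly the chain below. The package choices and file names are assumptions for illustration, not necessarily what PR #45 actually does:

```python
import onnx
import tensorflow as tf
from onnx_tf.backend import prepare

# Step 1 (done earlier via torch.onnx.export): model.onnx already exists.
onnx_model = onnx.load("model.onnx")
prepare(onnx_model).export_graph("saved_model")  # ONNX -> TF SavedModel

# Step 2: SavedModel -> TFLite, with dynamic-range quantization.
# Both hops (ONNX->TF and quantization) are lossy, which is likely where
# the accuracy drop for the quantized variant comes from.
converter = tf.lite.TFLiteConverter.from_saved_model("saved_model")
converter.optimizations = [tf.lite.Optimize.DEFAULT]
with open("model_quant.tflite", "wb") as f:
    f.write(converter.convert())
```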
Hi guys, I'm also very interested in this. Does anyone have an update?