[Official] ONNX Export Support with Dynamic Shapes
As previously discussed in #20, many users have encountered challenges when exporting models to ONNX, particularly when handling dynamic input shapes. The main difficulties include:
- **Untraceable dynamic shape induction**: ONNX export requires all dynamic reshaping operations to be driven by explicit shape tensors passed as inputs. This is often non-trivial to restructure.
- **On-the-fly UV encoding**: Some modules generate UV encodings based on the input aspect ratio and spatial dimensions. These often introduce large constant tensors into the exported graph, which bloats the model and complicates correct export.
We are working on resolving these issues. Feel free to leave a comment here if you’ve run into similar problems or have ideas on how to improve the export process.
Refer to docs/onnx.md.
Hello @EasternJournalist, I encountered the following issue when using Customized Exportation with Dynamic Shape. Even though I have already set `antialias=False` everywhere, the problem persists. In addition, when I try to convert the exported ONNX model to TensorRT, it fails due to the presence of an `If` node in the ONNX graph. Could you please advise how to resolve this? Thank you!
```
torch.onnx.errors.UnsupportedOperatorError: Exporting the operator 'aten::_upsample_bilinear2d_aa' to ONNX opset version 14 is not supported. Please feel free to request support or submit a pull request on PyTorch GitHub: https://github.com/pytorch/pytorch/issues.
```
@EasternJournalist, in the example ONNX conversion script, the `onnx_compatible_mode` flag is set as follows:
```python
model.onnx_compatible_mode = True  # Enable ONNX compatible mode
```
Perhaps I'm overlooking something, but it seems this flag is not propagated to the model's encoder, so the encoder is left with an uninitialized `onnx_compatible_mode` attribute. Logging the encoder's attribute before and after setting the flag on the container appears to confirm this.
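Setting an attribute on the container module does not automatically reach its children. A generic way to propagate such a flag, sketched here with a stand-in `nn.Sequential` rather than the repo's actual model, is to walk the module tree:

```python
# Sketch: propagate a flag to every submodule, since assigning it on the
# container alone leaves child modules (e.g. the encoder) without it.
import torch

def set_onnx_compatible_mode(model: torch.nn.Module, enabled: bool = True):
    for module in model.modules():  # modules() yields the model itself and all children
        module.onnx_compatible_mode = enabled

net = torch.nn.Sequential(torch.nn.Conv2d(3, 8, 3), torch.nn.ReLU())
set_onnx_compatible_mode(net, True)
assert all(m.onnx_compatible_mode for m in net.modules())
```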
@MicalWillen, did you set antialias to False via the onnx_compatible_mode flag?
Thanks!
Yes, I found that setting `model.onnx_compatible_mode = True` has no effect.
Hi everyone. Sorry for the silly mistake. I missed updating the main model file. It’s now fixed in #101.
Let me know if there is any other issue.