Export ONNX model with tensor shapes included
Ask a Question
Question
Is it possible to export ONNX models with tensor shapes included, so that shape inference does not need to be run when importing a model? How can this be done, for example when exporting from PyTorch?
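For reference, when a model only uses standard ONNX ops, shapes can be embedded by running shape inference once after export and saving the enriched model, so importers do not have to run it themselves. A minimal sketch (the model and input shape are placeholders):

```python
# Sketch: run shape inference once after export and save the result, so the
# inferred shapes are stored in the graph's value_info and travel with the file.
# The model and input shape below are placeholders.
import torch
import onnx
from onnx import shape_inference

model = torch.nn.Sequential(torch.nn.Conv2d(3, 8, 3), torch.nn.ReLU()).eval()
dummy_input = torch.randn(1, 3, 224, 224)

torch.onnx.export(model, dummy_input, "model.onnx")

inferred = shape_inference.infer_shapes(onnx.load("model.onnx"))
onnx.save(inferred, "model_with_shapes.onnx")
```

This only works while every op in the graph has a registered shape inference function, which is exactly what breaks in the situation described below.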
Further information
- Relevant Area: shape_inference
- When a model is exported with ops that are not part of the ONNX operator set (e.g. torch.onnx.export(..., operator_export_type=ONNX_FALLTHROUGH)), shape inference cannot be run on the resulting ONNX model; a sketch reproducing this follows the list.
- Is this issue related to a specific model? No
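A minimal sketch of the situation above. The custom op is a hypothetical stand-in, and the exact behaviour of both the exporter and shape inference depends on the PyTorch and ONNX versions:

```python
# Sketch: a module containing an op the ONNX exporter has no symbolic for is
# exported with ONNX_FALLTHROUGH, and shape inference on the result cannot
# handle the non-standard node. The op below is a hypothetical stand-in.
import torch
import onnx
from onnx import shape_inference


class MyCustomOp(torch.autograd.Function):
    # No symbolic() is defined, so the exporter does not know how to map this
    # op to ONNX and falls through, emitting a non-standard node.
    @staticmethod
    def forward(ctx, x):
        return x * 2.0


class ModelWithCustomOp(torch.nn.Module):
    def forward(self, x):
        return MyCustomOp.apply(x)


torch.onnx.export(
    ModelWithCustomOp(),
    torch.randn(1, 3, 8, 8),
    "model_custom_op.onnx",
    operator_export_type=torch.onnx.OperatorExportTypes.ONNX_FALLTHROUGH,
)

# No shape inference function is registered for the fallthrough node, so
# (depending on version and strict_mode) this raises or leaves the shapes
# downstream of the custom op unknown.
inferred = shape_inference.infer_shapes(onnx.load("model_custom_op.onnx"))
```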
Notes
It would also be an acceptable solution if a custom shape inference function could be injected into the Python shape inference code (rather than the C++ implementation); a manual workaround along those lines is sketched below.
This should be solved by https://github.com/onnx/onnx/issues/3281. Thanks.
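As a stopgap, the shapes can also be written into the graph by hand when they are known in advance. A minimal sketch of that workaround; the tensor name, dtype and shape are placeholders:

```python
# Sketch of a manual workaround: write known shapes into value_info so that
# downstream tools see them without running shape inference.
# The tensor name, dtype and shape below are placeholders.
import onnx
from onnx import helper, TensorProto

model = onnx.load("model_custom_op.onnx")

# Record the known shape of the custom op's output tensor.
vi = helper.make_tensor_value_info("custom_op_output", TensorProto.FLOAT, [1, 3, 8, 8])
model.graph.value_info.extend([vi])

onnx.save(model, "model_custom_op_with_shapes.onnx")
```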