tensorflow-onnx
Converting TF1 .meta weights to ONNX yields an ONNX parsing error
Describe the bug
Converting TF1 .meta weights to ONNX yields an ONNX parsing error.
Urgency
Moderate
System information
- OS Platform and Distribution: Ubuntu 20.04.3 LTS (Focal Fossa)
- TensorFlow version: 1.15.5 (also tried with TF 2.4.0)
- Python version: Python 3.8.10
- Using tensorflow=1.15.5, onnx=1.10.2, tf2onnx=1.10.0/5cd3b5, tensorrt=8.0.3.4
To Reproduce
```shell
python -m tf2onnx.convert \
    --checkpoint path-to-model.meta \
    --output model-onnx-opset15-tf1-test.onnx \
    --inputs inputs/input_img:0 \
    --outputs resnet18-linknet/logits/BiasAdd:0 \
    --opset 15 \
&& polygraphy surgeon sanitize model-onnx-opset15-tf1-test.onnx \
    --fold-constants \
    --output model-onnx-opset15-tf1-test-folded.onnx \
&& polygraphy run model-onnx-opset15-tf1-test-folded.onnx --trt
```
The input shape should be (512, 1024, 8) and the output shape (512, 1024, 53). It is a semantic segmentation model (LinkNet). All steps run smoothly until the last one, running the ONNX model with TensorRT.
Error (same with opsets 15,14,13,12):
```
[TensorRT] WARNING: onnx2trt_utils.cpp:362: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
[TensorRT] ERROR: ModelImporter.cpp:720: While parsing node number 570 [Neg -> "Neg__1238:0"]:
[TensorRT] ERROR: ModelImporter.cpp:721: --- Begin node ---
[TensorRT] ERROR: ModelImporter.cpp:722: input: "Sub__1222:0" output: "Neg__1238:0" name: "Neg__1238" op_type: "Neg"
[TensorRT] ERROR: ModelImporter.cpp:723: --- End node ---
[TensorRT] ERROR: ModelImporter.cpp:725: ERROR: onnx2trt_utils.cpp:2031 In function unaryHelper: [8] Assertion failed: validUnaryType && "This version of TensorRT does not support the given operator with the given input data type."
[E] In node 570 (unaryHelper): UNSUPPORTED_NODE: Assertion failed: validUnaryType && "This version of TensorRT does not support the given operator with the given input data type."
[!] Could not parse ONNX correctly
```
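The log shows TensorRT rejecting a `Neg` node whose input tensor is integer typed. As a debugging aid, one could scan the graph for unary ops fed by integer tensors before handing the model to TensorRT. The sketch below works on a plain-Python stand-in for the graph rather than a real `onnx.ModelProto`; the node names (`Sub__1222`, `Neg__1238`) are taken from the error log, and `find_suspect_nodes` is a hypothetical helper, not part of tf2onnx or TensorRT.

```python
# Hypothetical pre-flight check: flag unary ops with integer-typed inputs,
# the pattern that TensorRT's unaryHelper assertion rejects above.

# Unary ops that TensorRT of this era only accepts on floating-point inputs
# (an assumed subset for illustration).
FLOAT_ONLY_UNARY_OPS = {"Neg", "Abs", "Sqrt", "Exp", "Log"}
INTEGER_DTYPES = {"INT32", "INT64"}

def find_suspect_nodes(nodes):
    """Return (name, op_type, dtype) for unary nodes fed by integer tensors."""
    suspects = []
    for node in nodes:
        if node["op_type"] in FLOAT_ONLY_UNARY_OPS:
            for dtype in node["input_dtypes"]:
                if dtype in INTEGER_DTYPES:
                    suspects.append((node["name"], node["op_type"], dtype))
    return suspects

# Toy graph mirroring the failing region of the converted model.
graph = [
    {"name": "Sub__1222", "op_type": "Sub", "input_dtypes": ["INT64", "INT64"]},
    {"name": "Neg__1238", "op_type": "Neg", "input_dtypes": ["INT64"]},
    {"name": "Relu_0", "op_type": "Relu", "input_dtypes": ["FLOAT"]},
]

print(find_suspect_nodes(graph))  # -> [('Neg__1238', 'Neg', 'INT64')]
```

With the real model, the same scan could be done over `model.graph.node` after loading it with `onnx.load` and resolving tensor types from the graph's value info.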
I tried with TF1 and TF2 but both end up with this error.
Any help would be appreciated, as converting these TF1 weights to ONNX is a crucial step in switching to ONNX/TensorRT inference.
Thank you!
TensorRT doesn't natively support INT64. Is it possible to cast the weights from INT64 to INT32 in your case?
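A minimal sketch of that downcast, assuming the INT64 tensors hold values that fit in 32 bits (shape and index tensors in converted TF graphs usually do). `safe_downcast` is a hypothetical helper, not a tf2onnx or ONNX API:

```python
import numpy as np

INT32_MIN = np.iinfo(np.int32).min
INT32_MAX = np.iinfo(np.int32).max

def safe_downcast(arr: np.ndarray) -> np.ndarray:
    """Cast an int64 array to int32, refusing if any value would overflow."""
    if arr.dtype != np.int64:
        return arr  # leave non-int64 tensors untouched
    if arr.size and (arr.min() < INT32_MIN or arr.max() > INT32_MAX):
        raise OverflowError("tensor holds values outside the int32 range")
    return arr.astype(np.int32)

# e.g. a shape tensor like the ones tf2onnx emits as INT64
weights = np.array([512, 1024, 53], dtype=np.int64)
print(safe_downcast(weights).dtype)  # -> int32
```

Applied to a real model, each INT64 initializer in `model.graph.initializer` would be converted to a NumPy array (e.g. via `onnx.numpy_helper`), downcast this way, and written back before re-saving the model.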
It's been over 3 months, so closing this. Feel free to open a new one if the issue still exists.