
Trying to convert LSTM to ONNX

st1992 opened this issue on Jan 17, 2022 · 1 comment

I am trying to convert an LSTM model to ONNX and get the following warning during conversion:

WARNING - ONNX Failed to infer shapes and dtypes for [model_1/lstm/PartitionedCall/strided_slice_2, type: Slice]
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/dist-packages/tf2onnx/schemas.py", line 154, in infer_onnx_shape_dtype
    inferred_model = shape_inference.infer_shapes(model_proto, strict_mode=True)
  File "/usr/local/lib/python3.7/dist-packages/onnx/shape_inference.py", line 42, in infer_shapes
    inferred_model_str = C.infer_shapes(model_str, check_type, strict_mode, data_prop)
onnx.onnx_cpp2py_export.shape_inference.InferenceError: [ShapeInferenceError] Shape inference error(s):
(op_type:Slice, node name: model_1/lstm/PartitionedCall/strided_slice_2): [TypeInferenceError] Element type of input 0 unknown

System information

  • Google Colab
  • tf2onnx 1.10 (latest at the time)

Here is the notebook https://colab.research.google.com/drive/1j8tQTwcibc1ZnbQFoZ7EHkUB9Wr6nbsX?usp=sharing
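
For reference, here is a minimal sketch of the conversion path being attempted. The model definition below is a hypothetical stand-in (the actual architecture and input shapes are in the linked notebook); only the `tf2onnx.convert.from_keras` call reflects the real tf2onnx API.

```python
import tensorflow as tf
import tf2onnx

# Hypothetical stand-in for the notebook's model: a single LSTM layer.
# The real input shape and layer configuration may differ.
inputs = tf.keras.Input(shape=(None, 32), name="input")
outputs = tf.keras.layers.LSTM(64)(inputs)
model = tf.keras.Model(inputs, outputs, name="model_1")

# Convert the Keras model to ONNX. The explicit input_signature pins the
# input dtype and leaves the batch and time dimensions dynamic.
spec = (tf.TensorSpec((None, None, 32), tf.float32, name="input"),)
model_proto, _ = tf2onnx.convert.from_keras(
    model, input_signature=spec, opset=13, output_path="lstm.onnx"
)
```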

st1992 · Jan 17, 2022

I could not reproduce this issue with the code in the Colab notebook. Could you please retry it with the latest TensorFlow and tf2onnx?

If the problem still exists, please share the details along with the TensorFlow and tf2onnx versions in your environment.
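
For example, a quick way to upgrade and report the versions (generic commands, not specific to this issue):

```python
# First upgrade both packages:
#   pip install -U tensorflow tf2onnx
import tensorflow as tf
import tf2onnx

# Print the versions to include in the bug report.
print("tensorflow:", tf.__version__)
print("tf2onnx:", tf2onnx.__version__)
```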

fatcat-z · Sep 13, 2022