onnx-tensorflow
LSTM convert error
Describe the bug
The model is an OCR recognition model (CRNN + LSTM) exported to ONNX; converting it with onnx-tf fails on the LSTM node with the error below.
ValueError: in user code:

    /Users/xiaomi/ml_python/onnx-tensorflow/onnx_tf/backend_tf_module.py:99 __call__ *
        output_ops = self.backend._onnx_node_to_tensorflow_op(onnx_node,
    /Users/xiaomi/ml_python/onnx-tensorflow/onnx_tf/backend.py:347 _onnx_node_to_tensorflow_op *
        return handler.handle(node, tensor_dict=tensor_dict, strict=strict)
    /Users/xiaomi/ml_python/onnx-tensorflow/onnx_tf/handlers/handler.py:59 handle *
        return ver_handle(node, **kwargs)
    /Users/xiaomi/ml_python/onnx-tensorflow/onnx_tf/handlers/backend/lstm.py:287 version_7 *
        return cls._common(node, **kwargs)
    /Users/xiaomi/ml_python/onnx-tensorflow/onnx_tf/handlers/backend/lstm.py:249 _common *
        outputs, states = cls.rnn(x, tf.compat.v1.nn.rnn_cell.LSTMCell,
    /Users/xiaomi/ml_python/onnx-tensorflow/onnx_tf/handlers/backend/rnn_mixin.py:49 rnn *
        outputs, states = tf.compat.v1.nn.bidirectional_dynamic_rnn(
    /usr/local/lib/python3.8/site-packages/tensorflow/python/util/deprecation.py:346 new_func **
        return func(*args, **kwargs)
    /usr/local/lib/python3.8/site-packages/tensorflow/python/util/dispatch.py:206 wrapper
        return target(*args, **kwargs)
    /usr/local/lib/python3.8/site-packages/tensorflow/python/ops/rnn.py:438 bidirectional_dynamic_rnn
        output_fw, output_state_fw = dynamic_rnn(
    /usr/local/lib/python3.8/site-packages/tensorflow/python/util/deprecation.py:346 new_func
        return func(*args, **kwargs)
    /usr/local/lib/python3.8/site-packages/tensorflow/python/util/dispatch.py:206 wrapper
        return target(*args, **kwargs)
    /usr/local/lib/python3.8/site-packages/tensorflow/python/ops/rnn.py:684 dynamic_rnn
        (outputs, final_state) = _dynamic_rnn_loop(
    /usr/local/lib/python3.8/site-packages/tensorflow/python/ops/rnn.py:764 _dynamic_rnn_loop
        raise ValueError(

    ValueError: Input size (depth of inputs) must be accessible via shape inference, but saw value None.
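The failing call is tf.compat.v1.nn.bidirectional_dynamic_rnn, which needs the last (feature) dimension of the LSTM input to be statically known; shape inference only sees None here. One way to check which dimensions the exported graph actually declares is to dump the input shapes with the onnx package; this is a minimal sketch, assuming the rec.onnx file from the reproduce step below:

import onnx

# Print the declared shape of every graph input. A symbolic dim_param or a
# missing dim_value on the feature axis means TF shape inference has no
# static depth to hand to dynamic_rnn.
model = onnx.load("rec.onnx")
for inp in model.graph.input:
    dims = [
        d.dim_value if d.HasField("dim_value") else (d.dim_param or "?")
        for d in inp.type.tensor_type.shape.dim
    ]
    print(inp.name, dims)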
To Reproduce
Run the converter on the exported ONNX model:
onnx-tf convert -i rec.onnx -o rec_n --logging_level DEBUG
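For reference, the same conversion can also be run through the Python API instead of the CLI; a minimal sketch, assuming the same rec.onnx file:

import onnx
from onnx_tf.backend import prepare

# Load the ONNX model, convert it, and export a SavedModel directory
# (equivalent to the CLI's -o option).
onnx_model = onnx.load("rec.onnx")
tf_rep = prepare(onnx_model)
tf_rep.export_graph("rec_n")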
ONNX model file
Python, ONNX, ONNX-TF, Tensorflow version
Obtained by running get_version.py from the util folder.
- Python version: 3.8.6
- ONNX version: 1.9.0
- ONNX-TF version: 1.9.0
- Tensorflow version: 2.6.0
Additional context
If I change rnn_mixin.py as follows, the conversion completes without error, but the converted model cannot run inference.
if direction == "forward":
  outputs, states = tf.compat.v1.nn.dynamic_rnn(cell_fw, x, **rnn_kwargs)
elif direction == "bidirectional":
  print(x.shape)
  # Work around the missing static depth by pinning the feature dimension.
  # 288 is specific to this CRNN model.
  x.set_shape([None, None, 288])
  outputs, states = tf.compat.v1.nn.bidirectional_dynamic_rnn(
      cell_fw, cell_bw, x, **rnn_kwargs)
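A possibly less invasive alternative to patching rnn_mixin.py would be to pin the dynamic axis in the ONNX graph itself and re-run ONNX shape inference before converting, so the LSTM sees a static depth. This is only a sketch under the assumption that the unresolved axis is the last dimension of the first graph input and that 288 is the right size; both are model-specific guesses:

import onnx

model = onnx.load("rec.onnx")

# Pin the assumed dynamic axis of the first graph input to a fixed size.
# The axis index (-1) and the value 288 are illustrative assumptions.
dim = model.graph.input[0].type.tensor_type.shape.dim[-1]
dim.dim_value = 288

# Re-run shape inference so intermediate shapes (including the LSTM input)
# become concrete, then save a fixed copy to convert with onnx-tf.
model = onnx.shape_inference.infer_shapes(model)
onnx.save(model, "rec_fixed.onnx")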