onnx2torch
Function `convert` fails with layer `LayerNormalization`; however, `BatchNormalization` succeeds.
This is IPython code (run on Colab) that reproduces the error.
Code
# !pip install tensorflow==2.6.4 onnx==1.12.0 onnx2torch git+https://github.com/onnx/tensorflow-onnx
import tensorflow as tf
import onnx
from onnx2torch import convert

# Build and save a minimal Keras model with a single LayerNormalization layer.
with tf.device("/cpu:0"):
    tf_model = tf.keras.Sequential()
    tf_model.add(tf.keras.layers.Input((123,)))
    tf_model.add(tf.keras.layers.LayerNormalization())
    tf.keras.models.save_model(
        tf_model,
        "model.tf",
        overwrite=True,
        include_optimizer=False,
        save_format=None,
        signatures=None,
        options=None,
        save_traces=True,
    )

# Export the SavedModel to ONNX at opset 11, then try to convert it to PyTorch.
!python -m tf2onnx.convert --saved-model model.tf --output model.onnx --opset 11 --verbose
onnx_model = onnx.load("model.onnx")
encoder_pth = convert(onnx_model)  # raises KeyError (see below)
Error Message
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
<ipython-input-2-2bc3e427f014> in <module>()
22 get_ipython().system('python -m tf2onnx.convert --saved-model model.tf --output model.onnx --opset 11 --verbose')
23 onnx_model = onnx.load("model.onnx")
---> 24 encoder_pth = convert(onnx_model)
1 frames
/usr/local/lib/python3.7/dist-packages/onnx2torch/converter.py in convert(onnx_model_or_path, save_input_names, attach_onnx_mapping)
107 )
108
--> 109 torch_module, onnx_mapping = converter(onnx_node, onnx_graph)
110 if attach_onnx_mapping:
111 setattr(torch_module, 'onnx_mapping', onnx_mapping)
/usr/local/lib/python3.7/dist-packages/onnx2torch/node_converters/batch_norm.py in _(node, graph)
24 def _(node: OnnxNode, graph: OnnxGraph) -> OperationConverterResult:
25 scale_value_name = node.input_values[1]
---> 26 scale = graph.initializers[scale_value_name]
27 scale = scale.to_torch()
28
KeyError: 'StatefulPartitionedCall/sequential_1/layer_normalization_1/ones:0'
Hello! I think the reason is that the ONNX `LayerNormalization` op exists only in opset 17 and later. Since you export with opset 11, LayerNorm is represented by multiple nodes, one of which is a `BatchNormalization`. onnx2torch expects the `scale`, `bias`, `input_mean`, and `input_var` inputs of `BatchNormalization` to be in `graph.initializers`, but in your case they are dynamic (i.e. outputs of previous nodes). We will try to resolve your issue as soon as possible.
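For anyone who wants to confirm this on their own export, here is a minimal diagnostic sketch using only the `onnx` package (it assumes the `model.onnx` file produced by the repro above). It lists each `BatchNormalization` node in the graph and reports whether its parameter inputs are static initializers or dynamic tensors:

import onnx

model = onnx.load("model.onnx")
# Names of all tensors stored as static initializers in the graph.
initializer_names = {init.name for init in model.graph.initializer}

for node in model.graph.node:
    if node.op_type == "BatchNormalization":
        # Inputs after X are: scale, B (bias), input_mean, input_var.
        for input_name in node.input[1:]:
            status = "initializer" if input_name in initializer_names else "dynamic"
            print(f"{node.name or node.op_type}: {input_name} -> {status}")

If the parameters show up as dynamic, the graph is using the opset-11 decomposition described above. Re-exporting with an opset and tf2onnx version that emit a single `LayerNormalization` node (opset 17 or later) may avoid this particular `BatchNormalization` path, although a dedicated `LayerNormalization` converter would still be needed on the onnx2torch side.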