
Tensor Transformer-11-MultiHeadSelfAttention-Add/All:0 already processed

Open 34127chi opened this issue 4 years ago • 4 comments

While converting a Keras model to ONNX, I got the following error: AssertionError: Tensor Transformer-11-MultiHeadSelfAttention-Add/All:0 already processed

### Main code

import keras2onnx
onnx_model = keras2onnx.convert_keras(model, model.name, debug_mode=True)
temp_model_file = 'model.onnx'
keras2onnx.save_model(onnx_model, temp_model_file)
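For context, `model` here is the bert4keras-based Keras model referenced in the debug log below; the issue itself does not show how it is built or loaded. A minimal, hypothetical loading step might look like the sketch below (the file name and the exact set of `custom_objects` are assumptions, not taken from the issue):

```python
import keras
from bert4keras.layers import FeedForward  # custom layer that appears in the debug log

# Hypothetical: load the exported h5 model, registering the bert4keras custom layers
# it uses. A real model may need additional entries in custom_objects.
model = keras.models.load_model(
    'bert_model.h5',
    custom_objects={'FeedForward': FeedForward},
)
```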

### Debug log

Processing a keras layer - (Transformer-11-MultiHeadSelfAttention-Add: <class 'keras.layers.merge.Add'>)
        input : Transformer-10-FeedForward-Norm/add_1:0
        input : Transformer-11-MultiHeadSelfAttention-Dropout/cond/Merge:0
        input mask: Transformer-10-FeedForward-Add/All:0
        input mask: Transformer-10-FeedForward-Add/All:0
        output: Transformer-11-MultiHeadSelfAttention-Add/add:0
        output mask: Transformer-11-MultiHeadSelfAttention-Add/All:0
Processing a keras layer - (Transformer-11-FeedForward: <class 'bert4keras.layers.FeedForward'>)
Processing a tf node - Transformer-11-FeedForward/dense_72/add
        output: Transformer-11-FeedForward/dense_72/add:0
        input : Transformer-11-FeedForward/dense_72/Reshape_2:0
        input : Transformer-11-FeedForward/dense_72/Reshape_3:0
Processing a tf node - Transformer-11-MultiHeadSelfAttention-Add/All
        output: Transformer-11-MultiHeadSelfAttention-Add/All:0
Traceback (most recent call last):
  File "util.py", line 5, in <module>
    onnx_model = keras2onnx.convert_keras(model, model.name, debug_mode=True)
  File "python3.6/lib/python3.6/site-packages/keras2onnx/main.py", line 80, in convert_keras
    parse_graph(topology, tf_graph, target_opset, output_names, output_dict)
  File "python3.6/lib/python3.6/site-packages/keras2onnx/parser.py", line 842, in parse_graph
    graph, keras_node_dict, topo, top_level, output_names)
  File "python3.6/lib/python3.6/site-packages/keras2onnx/parser.py", line 606, in _parse_graph_core
    _on_parsing_tf_nodes(graph, nodes, varset, topology.debug_mode)
  File "python3.6/lib/python3.6/site-packages/keras2onnx/parser.py", line 319, in _on_parsing_tf_nodes
    operator.add_output(out0)
  File "python3.6/lib/python3.6/site-packages/keras2onnx/common/intop.py", line 72, in add_output
    assert False, "Tensor {} already processed".format(var.full_name)
AssertionError: Tensor Transformer-11-MultiHeadSelfAttention-Add/All:0 already processed

34127chi avatar Sep 07 '20 03:09 34127chi

We have transformers conversion in our nightly build here; would your model be one of them? It needs TensorFlow 2.2.0+. Which keras2onnx version are you using? Try the code from source.

jiafatom avatar Sep 07 '20 04:09 jiafatom

> try

No, the exported model is in h5 format; it can't be loaded by the transformers package, I tried and failed. However, I commented out https://github.com/onnx/keras-onnx/blob/master/keras2onnx/common/intop.py#L71 and it works.
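For context, the traceback points at the guard in `keras2onnx/common/intop.py` that raises this assertion. A rough sketch of what the workaround disables, reconstructed from the traceback rather than the actual source (names and surrounding logic are approximations):

```python
# Sketch of Operator.add_output in keras2onnx/common/intop.py, reconstructed from
# the traceback above; not the exact upstream code.
def add_output(self, var):
    # keras2onnx expects each tensor to be claimed as an output by exactly one operator.
    # Here the mask tensor Transformer-11-MultiHeadSelfAttention-Add/All:0 gets claimed
    # a second time, which trips the assertion.
    if var in self.outputs:
        assert False, "Tensor {} already processed".format(var.full_name)
    self.outputs.append(var)
```

Commenting out that assertion lets conversion proceed, but the resulting ONNX graph should still be validated, for example by running it with onnxruntime and comparing its outputs against the original Keras model.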

34127chi avatar Sep 09 '20 08:09 34127chi

> try
>
> No, the exported model is in h5 format; it can't be loaded by the transformers package, I tried and failed. However, I commented out https://github.com/onnx/keras-onnx/blob/master/keras2onnx/common/intop.py#L71 and it works.

Hi, I ran into the same problem. How did you solve it in the end?

adzhua avatar Apr 30 '21 07:04 adzhua

Did anyone solve this problem?

STHSF avatar Apr 30 '21 08:04 STHSF