Concatenating two AttentionDecoders raises a ValueError
Hi,
I have two networks that I want to concatenate. Here is the relevant piece of code:
...
a = Bidirectional(LSTM(256, return_sequences=True))(input_a)
a = AttentionDecoder(128, 128)(a)
...
b = Bidirectional(LSTM(256, return_sequences=True))(input_b)
b = AttentionDecoder(128, 128)(b)
...
c = concatenate([a, b])
d = Model([input_a, input_b], c)
This raises:
ValueError: The name "AttentionDecoder" is used 2 times in the model. All layer names should be unique.
Any idea how to deal with this problem? I already tried commenting out name='AttentionDecoder' inside the class/function.
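For anyone debugging this: the uniqueness check is not specific to AttentionDecoder. Keras rejects any model that contains two layers with the same explicit name. A minimal sketch with two identically named Dense layers reproduces the same error (standalone keras imports assumed; swap in tensorflow.keras if that is what you use):

```python
from keras.layers import Input, Dense
from keras.models import Model

input_a = Input(shape=(4,))
input_b = Input(shape=(4,))

# Two layers in the same model share one explicit name.
a = Dense(8, name='shared_name')(input_a)
b = Dense(8, name='shared_name')(input_b)

# Raises ValueError: The name "shared_name" is used 2 times in the model.
# All layer names should be unique.
model = Model([input_a, input_b], [a, b])
```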
It's late, but I hope it can help someone else: uncomment name='AttentionDecoder' inside the class, and then pass a unique name to each AttentionDecoder when you use it:
...
a = Bidirectional(LSTM(256, return_sequences=True))(input_a)
a = AttentionDecoder(128, 128, name='AttentionDecoder1')(a)
...
b = Bidirectional(LSTM(256, return_sequences=True))(input_b)
b = AttentionDecoder(128, 128, name='AttentionDecoder2')(b)
...
c = concatenate([a, b])
d = Model([input_a, input_b], c)
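If you would rather not hand-name every instance, another option (a sketch of a common Keras pattern, not something from this thread) is to stop hard-coding the name in the class and instead forward **kwargs to the base Layer; Keras then auto-generates unique names like attention_decoder_1, attention_decoder_2 whenever no name is given. Only the naming-related part of the constructor is shown; the real AttentionDecoder takes more arguments:

```python
from keras.layers import Layer

class AttentionDecoder(Layer):
    """Sketch: constructor that lets Keras handle layer naming."""

    def __init__(self, units, output_dim, **kwargs):
        self.units = units
        self.output_dim = output_dim
        # Forwarding **kwargs to Layer lets callers pass name='...'
        # explicitly, and lets Keras pick a unique auto-generated name
        # when they don't; no hard-coded self.name needed.
        super(AttentionDecoder, self).__init__(**kwargs)
```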