
Concatenate two AttentionDecoders raise ValueError

Open bagustris opened this issue 5 years ago • 1 comment

Hi,

I have two networks that I want to concatenate. Here is the relevant piece of code:

...
a = Bidirectional(LSTM(256, return_sequences=True))(input_a)
a = AttentionDecoder(128, 128)(a)
...
b = Bidirectional(LSTM(256, return_sequences=True))(input_b)
b = AttentionDecoder(128, 128)(b)
...
c = concatenate([a, b])
d = Model([input_a, input_b], c) 

This raises ValueError: The name "AttentionDecoder" is used 2 times in the model. All layer names should be unique.

Any idea how to deal with this problem? I already commented out name='AttentionDecoder' inside the class/function.

bagustris avatar Mar 27 '19 14:03 bagustris

It's late, but I hope it can help someone else: uncomment name='AttentionDecoder' inside the class, and then pass a unique name to each instance when using the AttentionDecoder class:

...
a = Bidirectional(LSTM(256, return_sequences=True))(input_a)
a = AttentionDecoder(128, 128, name='AttentionDecoder1')(a)
...
b = Bidirectional(LSTM(256, return_sequences=True))(input_b)
b = AttentionDecoder(128, 128, name='AttentionDecoder2')(b)
...
c = concatenate([a, b])
d = Model([input_a, input_b], c)
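To see why the unique names matter without needing the keras-attention package installed, here is a minimal sketch (plain Python, no Keras) of the uniqueness check Keras performs when building a Model; the function name check_layer_names and the layer-name strings are illustrative assumptions, not Keras internals:

```python
# Minimal sketch of the validation that triggers the error above:
# when a Model is built, Keras collects every layer's name and raises
# ValueError if any name appears more than once.
from collections import Counter

def check_layer_names(layer_names):
    """Raise ValueError if any layer name is duplicated (mimics Keras)."""
    counts = Counter(layer_names)
    for name, n in counts.items():
        if n > 1:
            raise ValueError(
                f'The name "{name}" is used {n} times in the model. '
                "All layer names should be unique."
            )

# Two AttentionDecoder instances sharing one hard-coded name fail:
try:
    check_layer_names(["bidirectional_1", "AttentionDecoder",
                       "bidirectional_2", "AttentionDecoder"])
except ValueError as e:
    print(e)

# Giving each instance a unique name, as in the fix above, passes:
check_layer_names(["bidirectional_1", "AttentionDecoder1",
                   "bidirectional_2", "AttentionDecoder2"])
```

Any layer that hard-codes name=... in its constructor will hit this as soon as it is instantiated twice in one model, which is why forwarding a caller-supplied name (e.g. AttentionDecoder(128, 128, name='AttentionDecoder1')) is the general fix.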

BigWheel92 avatar Nov 24 '19 04:11 BigWheel92