Saad Farooq

3 comments by Saad Farooq

It's late, but I hope this can help someone else: uncomment name='AttentionDecoder' inside the class, and then do the following when using the AttentionDecoder class: ... a = Bidirectional(LSTM(256, return_sequences=True))(input_a) a...
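
For context, here is a minimal sketch of how that truncated snippet might fit into a full model. It assumes the AttentionDecoder custom layer (e.g. from the datalogue/keras-attention project); the import path, input shape, and layer sizes are illustrative assumptions, not part of the original comment:

```python
# Sketch only: assumes the AttentionDecoder layer from the
# datalogue/keras-attention project is available locally; the import
# path, shapes, and units below are illustrative assumptions.
from keras.layers import Input, LSTM, Bidirectional
from keras.models import Model
from custom_recurrents import AttentionDecoder  # hypothetical local module

input_a = Input(shape=(20, 100))                # (timesteps, features), assumed
a = Bidirectional(LSTM(256, return_sequences=True))(input_a)
a = AttentionDecoder(256, 100)(a)               # (units, output_dim), assumed
model = Model(inputs=input_a, outputs=a)
model.compile(optimizer='adam', loss='categorical_crossentropy')
model.summary()
```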

@John-8704, I used SeqSelfAttention, available in the keras_self_attention library.
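
For reference, a minimal sketch of how SeqSelfAttention can be dropped into a Keras model; the vocabulary size, layer widths, and the binary-classification head are illustrative assumptions:

```python
# Sketch only: SeqSelfAttention from the keras-self-attention package
# (pip install keras-self-attention). Vocabulary size, layer widths,
# and the classification head are illustrative assumptions.
from keras.models import Sequential
from keras.layers import Embedding, Bidirectional, LSTM, GlobalMaxPooling1D, Dense
from keras_self_attention import SeqSelfAttention

model = Sequential([
    Embedding(input_dim=10000, output_dim=128),        # assumed vocab/embedding size
    Bidirectional(LSTM(64, return_sequences=True)),    # attention needs the full sequence
    SeqSelfAttention(attention_activation='sigmoid'),  # per-timestep self-attention
    GlobalMaxPooling1D(),                              # collapse timesteps for one label
    Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy')
model.summary()
```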

Unfortunately, I haven't implemented a seq-to-seq architecture. The following link may help you understand how to use attention in seq-to-seq models: www.tensorflow.org/tutorials/text/nmt_with_attention