keras-attention
Visualizing RNNs using the attention mechanism
Hello, I have a minor question: how is the dimension of the encoded sequence determined? Are there any instructions for calculating it?
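A minimal sketch of how that dimension falls out of the encoder, using hypothetical sizes and assuming a bidirectional LSTM encoder as in the repo's example model: with `return_sequences=True`, the encoder emits one vector per timestep, and the `Bidirectional` wrapper concatenates the forward and backward states, so the encoded sequence has dimension `2 * units` per timestep.

```python
from tensorflow.keras.layers import Input, Embedding, LSTM, Bidirectional
from tensorflow.keras.models import Model

# Hypothetical sizes, for illustration only
timesteps, vocab_size, embed_dim, units = 20, 1000, 64, 128

inputs = Input(shape=(timesteps,))
x = Embedding(vocab_size, embed_dim)(inputs)
# return_sequences=True keeps one vector per timestep; Bidirectional
# concatenates forward and backward states, doubling the feature dim.
encoded = Bidirectional(LSTM(units, return_sequences=True))(x)

encoder = Model(inputs, encoded)
print(encoder.output_shape)  # (None, 20, 256): encoded dim = 2 * units
```

In other words, the encoded-sequence dimension is not computed from the data; it is whatever hidden size you give the encoder RNN (doubled if bidirectional).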
Hi! Thank you for the code implementing attention; it was really helpful. You reference the article https://arxiv.org/pdf/1409.0473.pdf, but it seems that your implementation differs from the one in the article...
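For reference, the decoder equations from Section 3.1 of the cited paper (Bahdanau et al., arXiv:1409.0473) are:

```latex
% Output distribution (eqn. 4), conditioned on the *current* state s_i:
p(y_i \mid y_1, \dots, y_{i-1}, \mathbf{x}) = g(y_{i-1}, s_i, c_i)
% State update, attention context, and alignment weights:
s_i = f(s_{i-1}, y_{i-1}, c_i), \qquad
c_i = \sum_{j=1}^{T_x} \alpha_{ij} h_j, \qquad
\alpha_{ij} = \frac{\exp(e_{ij})}{\sum_{k=1}^{T_x} \exp(e_{ik})}, \qquad
e_{ij} = a(s_{i-1}, h_j)
```

Note that the alignment scores $e_{ij}$ use the *previous* state $s_{i-1}$, while the output $y_i$ conditions on the *current* state $s_i$; that distinction is exactly what the next issue is about.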
As pointed out in https://medium.com/@charlesenglebert/hello-f890944e39fd, there is an issue in how `y_t` is calculated in the model. https://github.com/datalogue/keras-attention/blob/856dbe66490a73c55dcfdc2f793aea6c7307b530/models/custom_recurrents.py#L275-L279 It should be `st`, not `stm` (see also eqn 4 in the [Bahdanau...
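A sketch of the corrected lines, assuming the variable names used in the linked `step()` method (`ytm` for the previous output, `stm` for the previous hidden state, `st` for the state just computed in the current step, `context` for $c_t$):

```python
from keras import backend as K
from keras import activations

# Inside AttentionDecoder.step(), after the GRU-style update has
# produced the new hidden state st from the previous state stm.
# Eqn. (4) of Bahdanau et al. conditions the output on the *current*
# state s_t, so the output projection should read:
yt = activations.softmax(
    K.dot(ytm, self.W_o)
    + K.dot(st, self.U_o)       # was K.dot(stm, self.U_o)
    + K.dot(context, self.C_o)
    + self.b_o)
```

Only the second term changes; the previous state `stm` is still the right input for computing the attention scores earlier in the step.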