Attention_Network_With_Keras

An example attention network with simple dataset.

2 issues

Why not take the output of the previous time step as the input of the next time step, together with context as the input?
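The pattern this question suggests (feeding the previous time step's output back in, concatenated with the attention context) can be sketched in NumPy. This is a minimal illustrative decoder step, not the repository's implementation; the weight `W`, bias `b`, and all shapes are assumptions for the sketch.

```python
import numpy as np

def decode_step(prev_y, context, W, b):
    """One illustrative decoder step: concatenate the previous
    output with the attention context to form this step's input,
    then apply a dense layer with tanh. W and b are assumed here."""
    x = np.concatenate([prev_y, context])
    return np.tanh(W @ x + b)

n_y, n_c, n_h = 3, 5, 4          # output, context, hidden sizes (illustrative)
rng = np.random.default_rng(0)
W = rng.standard_normal((n_h, n_y + n_c))
b = np.zeros(n_h)

prev_y = np.zeros(n_y)           # first step: no previous output yet
context = rng.standard_normal(n_c)
h = decode_step(prev_y, context, W, b)
print(h.shape)  # (4,)
```

At training time this loop is usually run with teacher forcing (feeding the ground-truth previous token rather than the model's own output), which is one reason an implementation might not wire the previous output back in directly.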

Hi, thank you for your example. I'm trying to use this attention example in my LSTM model. However, in the attention layer definition, the line `h = Lambda(lambda X: K.zeros(shape=(K.shape(X)[0], n_h)))(X)`...

Label: question
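For context on the line quoted in that issue: the `Lambda` builds an all-zeros initial hidden state whose batch dimension matches the input tensor. A NumPy sketch of what it computes (the hidden size `n_h` and the input shapes below are illustrative assumptions):

```python
import numpy as np

def initial_hidden_state(X, n_h):
    """Mimics K.zeros(shape=(K.shape(X)[0], n_h)): an all-zeros
    hidden state with one row per example in the batch."""
    return np.zeros((X.shape[0], n_h))

# X: a batch of 4 sequences, each 10 steps of 8 features (illustrative)
X = np.random.rand(4, 10, 8)
h0 = initial_hidden_state(X, n_h=16)
print(h0.shape)  # (4, 16)
```

Reading the batch size from `K.shape(X)` at graph-execution time (rather than hard-coding it) is what lets the same model run with any batch size.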