keras-self-attention

Attention mechanism for processing sequential data that considers the context for each timestamp.

4 keras-self-attention issues

The layer works fine on CPU, but on TensorFlow GPU I get this error: Blas xGEMV launch failed : a.shape=[1,2000000,4], b.shape=[1,4,1], m=2000000, n=1, k=4
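A minimal workaround sketch, assuming the failure is driven by the 2,000,000-step sequence overwhelming the GPU BLAS call: split the input into shorter windows before applying attention. The window size and the `split_into_windows` helper are illustrative choices, not part of the library.

```python
import numpy as np

def split_into_windows(x, window=10000):
    """Reshape a [1, T, F] sequence into [T // window, window, F] chunks."""
    _, t, f = x.shape
    t_trim = (t // window) * window          # drop the ragged tail for simplicity
    return x[:, :t_trim, :].reshape(-1, window, f)

x = np.random.rand(1, 2_000_000, 4).astype("float32")
chunks = split_into_windows(x)               # shape: (200, 10000, 4)
# model.predict(chunks) would now score attention per window instead of over
# the full 2,000,000-step sequence, at the cost of no cross-window context.
```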

Hi, is it possible to put the self-attention layer from this library after the input vector (word embeddings) and before the BiLSTM layer, as in the sketch below? How do the equations of the self-attention...
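A minimal sketch of the ordering the question describes, assuming the CyberZHG keras-self-attention package; the vocabulary size, embedding dimension, and unit counts are illustrative placeholders.

```python
import keras
from keras_self_attention import SeqSelfAttention

model = keras.models.Sequential([
    keras.layers.Embedding(input_dim=10000, output_dim=128, mask_zero=True),
    SeqSelfAttention(attention_activation='sigmoid'),   # attention directly over the embeddings
    keras.layers.Bidirectional(keras.layers.LSTM(64)),  # BiLSTM consumes the re-weighted embeddings
    keras.layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy')
model.summary()
```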

My question is: for the additive self-attention approach, are word embeddings from other timestamps taken into consideration when calculating the attention weights, or only the embedding from the current timestamp (meaning word...
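For reference, in Bahdanau-style additive scoring the embedding of every other timestamp enters the score for the current position directly, so the weights are not computed from the current timestamp alone. A sketch of that score, with weight names that are illustrative rather than necessarily the library's exact parameterization:

```latex
% Additive (Bahdanau-style) attention: x_t is the current timestamp,
% x_{t'} ranges over all timestamps; W_t, W_x, W_a, b_h, b_a are learned.
e_{t,t'} = \sigma\!\left( W_a^{\top} \tanh\!\left( W_t x_t + W_x x_{t'} + b_h \right) + b_a \right),
\qquad
a_{t,t'} = \frac{\exp(e_{t,t'})}{\sum_{\tau} \exp(e_{t,\tau})}
```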

Hello, I just want to know which paper Local Attention refers to. Thank you.
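The term "local attention" is usually traced to Luong et al. (2015), "Effective Approaches to Attention-based Neural Machine Translation", though whether this library's variant follows that paper exactly is not stated here. In the library itself, local attention is exposed through the attention_width parameter; a minimal sketch, with an illustrative width value:

```python
from keras_self_attention import SeqSelfAttention

local_attention = SeqSelfAttention(
    attention_width=15,                                  # local window around each timestamp
    attention_type=SeqSelfAttention.ATTENTION_TYPE_MUL,  # multiplicative scoring
    attention_activation=None,
)
```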