
Attention-based LSTM/Dense layers implemented in Keras.

LSTM_Attention issues (5 results)

Thanks, and I have the following questions: 1. What is the difference between AttentionLSTM and AttentionLSTM_t in your code attention_lstm.py? (i.e., what do you mean by 'attention_vec'?) 2. Is the output...
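For context, the name `attention_vec` conventionally refers to a softmax weight vector computed over an LSTM's output sequence. The sketch below shows that common pattern in Keras; it is illustrative only and may differ from the repo's actual AttentionLSTM/AttentionLSTM_t internals, and all sizes are placeholders.

```python
# Illustrative sketch of timestep attention over an LSTM, NOT the repo's
# actual code. `attention_vec` here is a softmax weight tensor over time.
from keras.layers import Input, LSTM, Dense, Permute, Multiply, Flatten
from keras.models import Model

TIME_STEPS, INPUT_DIM, UNITS = 20, 32, 64  # placeholder sizes

inputs = Input(shape=(TIME_STEPS, INPUT_DIM))
lstm_out = LSTM(UNITS, return_sequences=True)(inputs)   # (batch, T, units)

# Score each timestep and softmax over the time axis to get weights.
a = Permute((2, 1))(lstm_out)                           # (batch, units, T)
a = Dense(TIME_STEPS, activation='softmax')(a)          # softmax over T
attention_vec = Permute((2, 1), name='attention_vec')(a)

weighted = Multiply()([lstm_out, attention_vec])        # apply weights
output = Dense(1, activation='sigmoid')(Flatten()(weighted))
model = Model(inputs, output)
```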

What should I do if I want to place the attention layer after the CRF layer?
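Note the usual ordering is the reverse: the attention-weighted features feed the CRF, since the CRF models label transitions over the final per-timestep emission scores. A minimal sketch of that common sequence-labeling layout follows, assuming keras_contrib provides the CRF layer; it is not this repo's code, and all sizes are placeholders.

```python
# Sketch of the usual BiLSTM -> (attention) -> CRF ordering for sequence
# labeling. Assumes keras_contrib's CRF layer; sizes are placeholders.
from keras.layers import Input, Embedding, Bidirectional, LSTM, Dense, TimeDistributed
from keras.models import Model
from keras_contrib.layers import CRF

MAX_LEN, VOCAB, N_TAGS = 50, 10000, 9  # placeholder sizes

inputs = Input(shape=(MAX_LEN,))
x = Embedding(VOCAB, 128)(inputs)
x = Bidirectional(LSTM(64, return_sequences=True))(x)
# A per-timestep attention block would be inserted here, before the CRF.
x = TimeDistributed(Dense(N_TAGS))(x)   # per-timestep emission scores
crf = CRF(N_TAGS)
outputs = crf(x)

model = Model(inputs, outputs)
model.compile('adam', loss=crf.loss_function, metrics=[crf.accuracy])
```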

Hello, I have benefited a lot from reading your Git repo. One small question: the time_step of the AttentionDecoder's input must equal the encoder's time_step, so the AttentionDecoder's output time_step also equals the encoder's time_step. But in general the input and output time_step counts are different. For example, if the input is a question and the output is an answer, the question and the answer usually contain different numbers of words.
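This points at a real constraint of that decoder design. In plain Keras, one common way to decouple the output length from the input length is to collapse the encoder into a single vector and repeat it for the desired number of decoder steps. A minimal sketch under that approach (all sizes are placeholders, not the repo's code):

```python
# Sketch: encoder and decoder with DIFFERENT numbers of timesteps. The
# encoder is collapsed to one vector, then repeated T_OUT times so the
# decoder's length is independent of the encoder's. Sizes are placeholders.
from keras.layers import Input, LSTM, RepeatVector, TimeDistributed, Dense
from keras.models import Model

T_IN, T_OUT, IN_DIM, OUT_VOCAB = 30, 12, 64, 5000  # placeholder sizes

enc_in = Input(shape=(T_IN, IN_DIM))
state = LSTM(128)(enc_in)                  # (batch, 128): input summary
dec = RepeatVector(T_OUT)(state)           # (batch, T_OUT, 128)
dec = LSTM(128, return_sequences=True)(dec)
out = TimeDistributed(Dense(OUT_VOCAB, activation='softmax'))(dec)

model = Model(enc_in, out)
model.compile('adam', 'categorical_crossentropy')
```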

Could the author provide a requirements file?
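The repo does not appear to ship one. A hypothetical requirements.txt for a Keras-era attention project might look like the following; the author's actual dependencies and versions are not published, so only the obvious package names are listed, unpinned.

```
# Hypothetical requirements.txt -- the author's actual dependencies and
# version pins are unknown; this only lists the obvious Keras-era stack.
keras
tensorflow
numpy
```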