keras-language-modeling
Trainable weights in AttentionLSTMWrapper
This statement (https://github.com/codekansas/keras-language-modeling/blob/master/attention_lstm.py#L106) overwrites the trainable_weights of the inner LSTM layer, so the LSTM's own weights are no longer included in gradient updates. It should append the wrapper's attention weights to the LSTM layer's trainable weights instead.
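
A minimal sketch of the kind of change being suggested, assuming a Keras 1.x-style `Wrapper` whose `build()` creates its own attention parameters. The weight names (`W_a`, `b_a`), shapes, and initializers below are illustrative placeholders, not the actual variables at attention_lstm.py#L106:

```python
from keras import backend as K
from keras.layers.wrappers import Wrapper


class AttentionLSTMWrapper(Wrapper):
    def __init__(self, layer, attention_vec, **kwargs):
        self.attention_vec = attention_vec
        super(AttentionLSTMWrapper, self).__init__(layer, **kwargs)

    def build(self, input_shape):
        # Build the wrapped LSTM first so its weights exist.
        if not self.layer.built:
            self.layer.build(input_shape)
            self.layer.built = True
        super(AttentionLSTMWrapper, self).build()

        output_dim = self.layer.output_dim
        # Hypothetical attention parameters created by the wrapper.
        self.W_a = K.zeros((output_dim, output_dim), name='W_a')
        self.b_a = K.zeros((output_dim,), name='b_a')

        # Bug (current behaviour at L106): assigning a fresh list discards
        # the inner LSTM's trainable weights, so the LSTM stops training:
        #     self.trainable_weights = [self.W_a, self.b_a]
        #
        # Fix: keep the wrapped layer's weights and append the new ones.
        self.trainable_weights = self.layer.trainable_weights + [self.W_a, self.b_a]
```

The key point is the last line: concatenating `self.layer.trainable_weights` with the wrapper's own parameters keeps both sets of weights visible to the optimizer, whereas the current assignment replaces the list outright.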