
How do you implement the attention-gated hidden layer?

Open WenssonS opened this issue 4 years ago • 0 comments

I found only one file, "LSTMTagger.py", that implements the RKGE model. In this file, however, the output of the LSTM layer is used directly to form the final h that is fed into the max-pooling layer. Maybe I missed something important, so could you tell me how you implement the attention-gated hidden layer?
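For reference, here is a minimal sketch of what I would expect the gating to look like, assuming a per-step sigmoid attention weight applied to each LSTM hidden state before max-pooling. The class name `AttentionGatedLSTM` and the single-linear-layer gate projection are my own assumptions, not code from this repository:

```python
import torch
import torch.nn as nn

class AttentionGatedLSTM(nn.Module):
    """Hypothetical sketch: gate each LSTM hidden state with a learned
    sigmoid attention weight, then max-pool over time steps."""

    def __init__(self, input_dim, hidden_dim):
        super().__init__()
        self.lstm = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.att = nn.Linear(hidden_dim, 1)  # assumed gate projection

    def forward(self, x):
        out, _ = self.lstm(x)                # (batch, seq, hidden)
        gate = torch.sigmoid(self.att(out))  # (batch, seq, 1), in [0, 1]
        gated = gate * out                   # attention-gated hidden states
        pooled, _ = gated.max(dim=1)         # max-pooling over the sequence
        return pooled
```

Is this roughly what the paper describes, or does the released code compute the gate differently?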

WenssonS avatar Aug 02 '20 09:08 WenssonS