HieCoAttenVQA
Clarify use of num_layers as n in the LSTM implementation
Hello,
I am trying to re-implement your paper in Keras and am currently struggling with your LSTM implementation.
You pass num_layers as n when initializing the LSTM, but num_layers should be the depth of the LSTM (the number of stacked layers). In the LSTM implementation, however, it seems to be used as the number of timesteps L. Is that correct?
https://github.com/jiasenlu/HieCoAttenVQA/blob/82b0bb093ce9f033c1a55e9da1a4344291184c19/misc/ques_level.lua#L18
https://github.com/jiasenlu/HieCoAttenVQA/blob/82b0bb093ce9f033c1a55e9da1a4344291184c19/misc/LSTM.lua#L18
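To illustrate the distinction I mean, here is a toy plain-Python sketch (not the actual Torch code; the names are made up) of the "depth" reading of num_layers, where each stacked layer has its own weights and those weights are reused at every timestep:

```python
def make_cell(layer_idx):
    """Stand-in for one LSTM cell; returns a label for its weight set."""
    return f"W_layer{layer_idx}"

def stacked_lstm(num_layers, timesteps):
    """num_layers as depth: one weight set per *layer*, reused at every timestep."""
    cells = [make_cell(i) for i in range(num_layers)]
    # Unrolling over time reuses the same cells (weights) at each step.
    trace = [[cells[l] for l in range(num_layers)] for _ in range(timesteps)]
    return cells, trace

cells, trace = stacked_lstm(num_layers=2, timesteps=4)
assert len(cells) == 2                       # two weight sets total (depth)
assert all(step == cells for step in trace)  # same weights at all 4 timesteps
```

Under the other reading, num_layers would instead play the role of timesteps=L here, which is what the linked line looks like to me.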
Furthermore, createClones appears to create a separate set of weights for each timestep. Is this intentional, or is it a bug? An LSTM should share the same weights across time.
https://github.com/jiasenlu/HieCoAttenVQA/blob/82b0bb093ce9f033c1a55e9da1a4344291184c19/misc/ques_level.lua#L52
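To make the weight-sharing concern concrete, here is a hypothetical plain-Python stand-in (a list in place of a weight tensor, not the actual Torch semantics) contrasting clones that share storage with independent copies:

```python
weights = [0.5, -0.1]  # one underlying parameter storage

# Shared clones: every timestep's "clone" points at the same storage,
# so an update to the parameters is visible at every step.
shared_clones = [weights for _ in range(3)]
weights[0] = 1.0
assert all(clone[0] == 1.0 for clone in shared_clones)

# Independent copies: each timestep gets its own weights,
# so the steps can drift apart during training.
independent = [list(weights) for _ in range(3)]
independent[0][0] = -9.0
assert independent[1][0] == 1.0  # other steps unaffected
```

If the clones produced by createClones share their parameter storage, everything is fine; if they are independent copies as in the second case, that would break the recurrence.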