
Question about Lang attn in RLA module?

Open nero1342 opened this issue 1 year ago • 0 comments

Hi, I have a question related to the RLA module.

```python
lang_feat_att = self.lang_proj(lang_feat_att)
lang_feat_att = self.RLA_lang_att(output, lang_feat_att.permute(1, 0, 2)) * F.sigmoid(self.lang_weight)
output = output + lang_feat_att * self.rla_weight
```

It seems that `RLA_lang_att` does not contribute much. I tried removing these lines of code and the result stayed the same. Moreover, since `self.rla_weight = 0.1` and this attention is only used in the first layer, `lang_feat_att` may barely affect the output. However, the paper reports that it improves performance by ~1%. Is there a mistake, or did I misunderstand something?
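To make the scale of the language branch concrete, here is a minimal scalar sketch of the gating arithmetic in the snippet above (the real code operates on tensors; `lang_weight = 0.0` at initialization is an assumption for illustration, giving a sigmoid gate of 0.5):

```python
import math

def gated_residual(output, lang_feat_att, lang_weight, rla_weight):
    """Scalar sketch of: output + F.sigmoid(lang_weight) * lang_feat_att * rla_weight."""
    gate = 1.0 / (1.0 + math.exp(-lang_weight))  # F.sigmoid(self.lang_weight)
    return output + gate * lang_feat_att * rla_weight

# Assuming lang_weight starts at 0 (gate = 0.5) and rla_weight = 0.1,
# the language-attention branch is scaled by only 0.5 * 0.1 = 0.05,
# which is why its contribution to `output` can look negligible.
print(gated_residual(1.0, 1.0, 0.0, 0.1))
```

Under these assumed values the branch adds only 5% of `lang_feat_att` to the residual stream, consistent with the observation that removing it changes results very little.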

nero1342 avatar Nov 27 '23 07:11 nero1342