RecBole
[🐛BUG] SINE attention weighting implementation
Description:
Based on the SINE paper, in the attention weighting step the authors add trainable positional embeddings to the input embeddings so that the model can use item position when computing P_{t|k}. However, in RecBole's SINE attention weighting implementation, the original input embeddings are still used for this calculation, without the positional embeddings.
Here are the relevant details from the paper for reference:
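To make the reported discrepancy concrete, here is a minimal NumPy sketch of the attention weighting as the paper describes it. All names (`attention_weights`, `w1`, `w2`, the tanh-based scoring form) are illustrative assumptions, not RecBole's actual code; the point is only that the positional embeddings are added to the input *before* the attention logits are computed:

```python
import numpy as np

def attention_weights(x, pos_emb, w1, w2):
    """Illustrative sketch of position-aware attention weighting.

    x       : (seq_len, d) input item embeddings
    pos_emb : (seq_len, d) trainable positional embeddings
    w1, w2  : hypothetical projection weights for the attention score

    The fix described in the issue: use (x + pos_emb) rather than
    x alone when computing the attention logits, per the SINE paper.
    """
    h = np.tanh((x + pos_emb) @ w1)   # position-aware hidden representation
    logits = h @ w2                   # one score per position, shape (seq_len,)
    a = np.exp(logits - logits.max()) # numerically stable softmax
    return a / a.sum()                # normalized attention weights

# Toy usage with random values standing in for learned parameters
rng = np.random.default_rng(0)
seq_len, d, d_hidden = 5, 4, 8
x = rng.normal(size=(seq_len, d))
pos = rng.normal(size=(seq_len, d))
w1 = rng.normal(size=(d, d_hidden))
w2 = rng.normal(size=(d_hidden,))
weights = attention_weights(x, pos, w1, w2)
```

The reported bug corresponds to passing `x` alone into the tanh projection, which makes the resulting weights invariant to item position.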
@Elvenson Thank you for your advice on RecBole. We will revise the code according to the paper, test it on the datasets, and update it soon.