
[🐛BUG] SINE attention weighting implementation

Open Elvenson opened this issue 1 year ago • 1 comment

Description:

According to the SINE paper, the attention weighting step adds trainable positional embeddings to the input embeddings, so that the model can use item position when computing P_{t|k}. However, in RecBole's SINE attention weighting implementation, the original input embeddings are still used for this calculation, without the positional embeddings.
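For illustration, here is a minimal PyTorch sketch of the change being suggested. The module, layer names, and shapes are assumptions for this example and are not taken from RecBole's actual SINE code; the point is only where the trainable positional embeddings are added before the attention weights are computed.

```python
import torch
import torch.nn as nn


class AttentionWeighting(nn.Module):
    """Sketch of SINE-style attention weighting with trainable positional
    embeddings. Names and shapes are illustrative, not RecBole's code."""

    def __init__(self, embedding_size, max_seq_len, num_interests):
        super().__init__()
        # trainable positional embeddings, one vector per sequence position
        self.pos_embedding = nn.Parameter(
            torch.randn(max_seq_len, embedding_size) * 0.02
        )
        # hypothetical projections used to score each position per interest
        self.w1 = nn.Linear(embedding_size, embedding_size, bias=False)
        self.w2 = nn.Linear(embedding_size, num_interests, bias=False)

    def forward(self, item_emb):
        # item_emb: (batch, seq_len, embedding_size)
        seq_len = item_emb.size(1)
        # the fix suggested in this issue: add positional embeddings to the
        # input embeddings before computing the attention weights P_{t|k}
        x = item_emb + self.pos_embedding[:seq_len].unsqueeze(0)
        # score each position for each interest, normalize over positions
        scores = self.w2(torch.tanh(self.w1(x)))  # (batch, seq_len, num_interests)
        p_t_k = torch.softmax(scores, dim=1)      # attention over positions t
        return p_t_k
```

The behavior described in the report corresponds to computing `scores` from `item_emb` directly, which drops the positional term from the attention weighting.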

Here is the relevant detail from the paper for reference: [screenshot of the attention weighting equations from the SINE paper]

Elvenson avatar Feb 08 '24 09:02 Elvenson

@Elvenson Thank you for your advice on RecBole. We will revise the code according to the paper, test it on the datasets, and update it soon.

Fotiligner avatar Mar 03 '24 06:03 Fotiligner