Question regarding Pathfinder and ListOps performance
Hi, thank you for releasing the code for this inspiring work! While trying to reproduce the results of the Transformer and Linear Transformer on the Pathfinder32 and ListOps tasks, I ran into the following problems:
(1) The Transformer and Linear Transformer only reached about 50% accuracy on Pathfinder32. If I replaced the fixed positional encoding (from the official config) with a learnable positional embedding (see the sketch after this list), the Transformer reached around 70%, but the Linear Transformer stayed at 50%.
(2) On ListOps, the Transformer only reached about 17% accuracy with either the fixed positional encoding (official config) or a learnable positional embedding.
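
For reference, here is a minimal sketch of what I mean by swapping the fixed positional encoding for a learnable embedding. This is not the repo's exact code; the module and argument names (`AddPositionEmbs`, `max_len`, `learnable`) are just illustrative, written in Flax since that is what the benchmark uses.

```python
# Minimal sketch of the two positional-encoding variants I compared:
# fixed sinusoidal encodings vs. a learnable embedding table added to
# the token embeddings. Names are illustrative, not the repo's API.
import numpy as np
import jax.numpy as jnp
import flax.linen as nn


def sinusoidal_encoding(max_len: int, d_model: int) -> jnp.ndarray:
    """Standard fixed sin/cos positional encoding, shape (max_len, d_model)."""
    position = np.arange(max_len)[:, None]
    div_term = np.exp(np.arange(0, d_model, 2) * (-np.log(10000.0) / d_model))
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(position * div_term)
    pe[:, 1::2] = np.cos(position * div_term)
    return jnp.asarray(pe)


class AddPositionEmbs(nn.Module):
    """Adds either fixed sinusoidal or learnable position embeddings."""
    max_len: int
    d_model: int
    learnable: bool = False  # the switch I toggled in my experiments

    @nn.compact
    def __call__(self, x):  # x: (batch, seq_len, d_model)
        seq_len = x.shape[1]
        if self.learnable:
            # Learned embedding table, trained together with the model.
            pos_emb = self.param(
                'pos_embedding',
                nn.initializers.normal(stddev=0.02),
                (self.max_len, self.d_model),
            )
        else:
            # Fixed sinusoidal encoding, not trained.
            pos_emb = sinusoidal_encoding(self.max_len, self.d_model)
        return x + pos_emb[None, :seq_len, :]
```

In my runs I simply instantiated this with `learnable=False` (matching the official config) or `learnable=True`, keeping everything else unchanged.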
Thank you very much for your help!