long-range-arena

Question regarding Pathfinder and ListOps performance

Open LeoXinhaoLee opened this issue 2 years ago • 2 comments

Hi, thank you for releasing code for this inspiring work! When I was trying to reproduce the results of Transformer and Linear Transformer on Pathfinder32 and Listops tasks, I encountered the following problems:

(1) The Transformer and the Linear Transformer only reached about 50% accuracy on the Pathfinder32 task. If I replaced the fixed positional encoding (from the official config) with a learnable positional embedding, the Transformer reached around 70%, but the Linear Transformer stayed at 50%.
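For context, the swap being described is between a fixed sinusoidal positional encoding and a trainable embedding table. The following is a minimal sketch of the two options (function names and the initialization scale are illustrative, not taken from the LRA codebase):

```python
import numpy as np

def sinusoidal_encoding(max_len, d_model):
    # Fixed (non-trainable) positional encoding, as in "Attention Is All
    # You Need": even dims use sin, odd dims use cos of scaled positions.
    pos = np.arange(max_len)[:, None]
    i = np.arange(d_model)[None, :]
    angle = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    enc = np.zeros((max_len, d_model))
    enc[:, 0::2] = np.sin(angle[:, 0::2])
    enc[:, 1::2] = np.cos(angle[:, 1::2])
    return enc

def learnable_embedding(max_len, d_model, rng):
    # Learnable alternative: a randomly initialized table that would be
    # registered as a parameter and updated by the optimizer.
    return rng.normal(0.0, 0.02, size=(max_len, d_model))
```

Either table is added to the token embeddings before the first attention layer; only the second one receives gradient updates.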

(2) On the ListOps task, the Transformer only reached about 17% accuracy, whether with the fixed positional encoding (official config) or with a learnable positional embedding.

Thank you very much for your help!

LeoXinhaoLee · Sep 25 '23 01:09

For ListOps, I think the checkpointing is broken somehow. The average performance across runs appears to be the same as a randomly initialized model.

However, you can get the correct result for the trained model by evaluating on the test set at the end of training, rather than saving the model to a checkpoint and reloading it.
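The workaround amounts to restructuring the training script so evaluation runs on the in-memory parameters, rather than on parameters restored from disk. A hypothetical sketch (the function names are placeholders, not the LRA codebase's API):

```python
def train_and_evaluate(model, train_steps, train_step_fn, evaluate_fn):
    # Run the training loop, threading the model state through each step.
    for step in range(train_steps):
        model = train_step_fn(model, step)
    # Evaluate the live, in-memory model directly at the end of training.
    # This sidesteps the save/reload path, which this thread reports can
    # silently hand back what behaves like a randomly initialized model.
    return evaluate_fn(model)
```

The design point is simply ordering: test-set evaluation happens before (or instead of) any checkpoint round-trip, so a broken restore path cannot corrupt the reported result.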

lucaslingle · Jan 07 '24 04:01

@LeoXinhaoLee

By the way, did you change any other settings for Pathfinder32? I tried your suggestion, but I am still getting only 50% accuracy with a vanilla Transformer, even with learnable positional embeddings.

Thanks for any insights you can provide!

lucaslingle · Jan 07 '24 08:01