
PyTorch Geometric version attention code is buggy

Open · ItamarChinn opened this issue 2 years ago · 1 comment

First of all, this is a great repo, and thank you for it. The PyG version, however, has a few bugs in the attention code.

Here are the ones I have encountered (a sketch of the suspected fixes follows the list):

  1. In the forward method, the attention layer is indexed at -1 (not 0) and the EGNN layer at 0 (not -1), which is the opposite of the ordering in the other implementation.
  2. The self.global_tokens initialization references an undefined variable dim.
  3. It uses GlobalLinearAttention from the other implementation even though GlobalLinearAttention_Sparse is defined in the same file (not sure whether this is a bug or intentional?).
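
For concreteness, here is a rough, untested sketch of what I think the fixes look like. The variable names (feats_dim, num_global_tokens, global_linear_attn_heads, global_linear_attn_dim_head) and the ModuleList pairing are my assumptions based on the dense implementation, so treat this as pseudocode rather than a drop-in patch:

```python
import torch
from torch import nn

# NOTE: excerpted pseudocode; signatures and surrounding variables are
# assumptions based on the dense (non-geometric) implementation

# (2) in __init__: `dim` is undefined at this point; presumably the
# feature dimension (feats_dim) was intended
self.global_tokens = nn.Parameter(torch.randn(num_global_tokens, feats_dim))

# (3) in __init__: use the sparse attention class already defined in this
# file instead of the GlobalLinearAttention imported from the dense version
attn_layer = GlobalLinearAttention_Sparse(dim = feats_dim,
                                          heads = global_linear_attn_heads,
                                          dim_head = global_linear_attn_dim_head)
self.mpnn_layers.append(nn.ModuleList([attn_layer, egnn_layer]))

# (1) in forward: unpack the pair in the same order it was constructed,
# attention at index 0 and the EGNN layer at index -1
attn_layer, egnn_layer = layer[0], layer[-1]
```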

I have refactored a lot of the code and can try to open a PR in a few days.

ItamarChinn · Jan 22 '23 22:01