how_attentive_are_gats

Code for the paper "How Attentive are Graph Attention Networks?" (ICLR'2022)
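The paper's central contrast is between GAT's "static" attention, e(h_i, h_j) = LeakyReLU(aᵀ[Wh_i ‖ Wh_j]), and GATv2's "dynamic" attention, e(h_i, h_j) = aᵀ LeakyReLU(W[h_i ‖ h_j]), which simply moves the nonlinearity inside the learned projection. A minimal NumPy sketch of the two scoring functions (illustrative only, not the repository's implementation; weight shapes and names are assumptions):

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    # Elementwise LeakyReLU with the negative slope used in GAT.
    return np.where(x > 0, x, slope * x)

def gat_score(h_i, h_j, W, a):
    # GAT (static attention): e = LeakyReLU(a^T [W h_i || W h_j])
    # W: (d_out, d_in), a: (2 * d_out,)
    z = np.concatenate([W @ h_i, W @ h_j])
    return leaky_relu(a @ z)

def gatv2_score(h_i, h_j, W, a):
    # GATv2 (dynamic attention): e = a^T LeakyReLU(W [h_i || h_j])
    # W: (d_out, 2 * d_in), a: (d_out,)
    z = W @ np.concatenate([h_i, h_j])
    return a @ leaky_relu(z)
```

Because the GAT form applies `a` before the nonlinearity, the ranking of neighbors is the same for every query node; GATv2's reordering lets the attended neighbor depend on the query, which is the property the paper's Figure 1 illustrates.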

Issues (4)

In the original paper, GAT outperforms GraphSAGE on PPI. Why didn't you compare GATv2 on PPI?

Thanks for your great contribution! I'm confused about Figure 1(a) in your paper. Which GAT layer does this attention matrix come from? Is it the attention matrix of all layers...

This repo contains no code to reproduce Figure 1 from the paper.

Hello, could you provide the specific hyperparameter settings for reproducing the paper's test results on the PubMed dataset?