
code question

Open · linbeijianbaoxia opened this issue 3 years ago · 1 comment

Does the line `x = torch.cat([att(x, adj) for att in self.attentions], dim=1)` compute the attention coefficients? I can't understand how it works. Sorry, I'm new to this.

linbeijianbaoxia · Dec 14 '21, 11:12

I think in this line, x is processed by multi-head attention. Each `att(x, adj)` computes one attention head: it calculates that head's attention coefficients and returns the attention-weighted node features. The outputs of the heads are then concatenated and passed to another layer whose output dimension equals the number of label classes; that final layer plays a role similar to the MLP in the traditional attention setup. You can search for "multi-head attention" for more information.
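
To make this concrete, here is a minimal, self-contained sketch of the idea (not the repo's exact code; `SimpleGATHead` and its internals are simplified stand-ins for pyGAT's `GraphAttentionLayer`). Each head computes its own attention coefficients and weighted features, and the per-head outputs are concatenated along the feature dimension, exactly as in the quoted line:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleGATHead(nn.Module):
    """One attention head: computes attention coefficients over neighbours
    and returns the attention-weighted, transformed node features."""

    def __init__(self, in_features, out_features):
        super().__init__()
        self.W = nn.Linear(in_features, out_features, bias=False)  # feature transform
        self.a = nn.Linear(2 * out_features, 1, bias=False)        # attention scorer

    def forward(self, x, adj):
        h = self.W(x)                                   # (N, out_features)
        N = h.size(0)
        # Pairwise concatenations [h_i || h_j] -> raw attention logits e_ij
        h_i = h.unsqueeze(1).expand(N, N, -1)
        h_j = h.unsqueeze(0).expand(N, N, -1)
        e = F.leaky_relu(self.a(torch.cat([h_i, h_j], dim=-1)).squeeze(-1))
        # Mask non-edges, then normalise: these are the attention coefficients
        e = e.masked_fill(adj == 0, float('-inf'))
        alpha = F.softmax(e, dim=1)                     # (N, N) coefficients
        return alpha @ h                                # weighted aggregation per node


# Multi-head setup: each head has its own coefficients and output,
# and the head outputs are concatenated along the feature dimension.
N, in_dim, hid_dim, n_heads = 5, 8, 4, 3
x = torch.randn(N, in_dim)
adj = (torch.rand(N, N) > 0.5).float()
adj.fill_diagonal_(1)                                   # keep self-loops

attentions = nn.ModuleList([SimpleGATHead(in_dim, hid_dim) for _ in range(n_heads)])
out = torch.cat([att(x, adj) for att in attentions], dim=1)
print(out.shape)                                        # (N, n_heads * hid_dim) = (5, 12)
```

So the quoted line does not return the attention coefficients themselves; the coefficients are computed inside each head and used to weight the features. In the final output layer, the GAT paper averages the heads instead of concatenating them.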

defensetongxue · Feb 20 '22, 06:02