
Question about the SpecialSpmmFunction class

Open · Colorfu1 opened this issue on Oct 15, 2019 · 0 comments

I noticed that SpecialSpmmFunction is a subclass of torch.autograd.Function, and that only one SpecialSpmm object is created in the SpGraphAttentionLayer class:

```python
class SpGraphAttentionLayer(nn.Module):
    def __init__(self, in_features, out_features, dropout, alpha, concat=True):
        super(SpGraphAttentionLayer, self).__init__()
        self.in_features = in_features
        self.out_features = out_features
        self.alpha = alpha
        self.concat = concat

        self.W = nn.Parameter(torch.zeros(size=(in_features, out_features)))
        nn.init.xavier_normal_(self.W.data, gain=1.414)

        self.a = nn.Parameter(torch.zeros(size=(1, 2*out_features)))
        nn.init.xavier_normal_(self.a.data, gain=1.414)

        self.dropout = nn.Dropout(dropout)
        self.leakyrelu = nn.LeakyReLU(self.alpha)
        self.special_spmm = SpecialSpmm()
```
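
For context, here is a minimal sketch of how SpecialSpmm and SpecialSpmmFunction plausibly fit together, reconstructed from the snippet above; the backward details are my own illustration and may differ from the repository's actual code:

```python
import torch
import torch.nn as nn

class SpecialSpmmFunction(torch.autograd.Function):
    """Sparse matmul whose backward only touches the sparse region (sketch)."""
    @staticmethod
    def forward(ctx, indices, values, shape, b):
        # Build a sparse matrix from the edge list and multiply it by b.
        a = torch.sparse_coo_tensor(indices, values, shape)
        ctx.save_for_backward(a, b)
        return torch.matmul(a, b)

    @staticmethod
    def backward(ctx, grad_output):
        a, b = ctx.saved_tensors
        grad_values = grad_b = None
        if ctx.needs_input_grad[1]:
            # Gradient w.r.t. the nonzero values only.
            grad_a_dense = grad_output.matmul(b.t())
            edge_idx = a._indices()[0, :] * a.shape[1] + a._indices()[1, :]
            grad_values = grad_a_dense.view(-1)[edge_idx]
        if ctx.needs_input_grad[3]:
            grad_b = a.t().matmul(grad_output)
        # indices and shape receive no gradient.
        return None, grad_values, None, grad_b

class SpecialSpmm(nn.Module):
    def forward(self, indices, values, shape, b):
        # .apply constructs a fresh Function node on every call.
        return SpecialSpmmFunction.apply(indices, values, shape, b)
```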

But the official documentation says that "Each function object is meant to be used only once (in the forward pass)." Yet self.special_spmm is called twice in the forward pass:

```python
e_rowsum = self.special_spmm(edge, edge_e, torch.Size([N, N]),
                             torch.ones(size=(N, 1), device=dv))
# e_rowsum: N x 1

edge_e = self.dropout(edge_e)
# edge_e: E

# Each function object is meant to be used only once (in the forward pass).
h_prime = self.special_spmm(edge, edge_e, torch.Size([N, N]), h)
```
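
To make the pattern concrete, here is a standalone toy example (a hypothetical Double function of my own, not from this repo) that applies one torch.autograd.Function class twice in a single forward pass, the same way self.special_spmm is used above:

```python
import torch

class Double(torch.autograd.Function):
    """Toy Function that doubles its input (hypothetical illustration)."""
    @staticmethod
    def forward(ctx, x):
        return 2 * x

    @staticmethod
    def backward(ctx, grad_output):
        return 2 * grad_output

x = torch.ones(3, requires_grad=True)
# Each .apply call builds its own autograd graph node,
# even though the same Function class is used twice.
y = Double.apply(x)
z = Double.apply(y)
z.sum().backward()
print(x.grad)  # tensor([4., 4., 4.])
```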

Have I misunderstood something?

Colorfu1 · Oct 15 '19 12:10