
edge attributes

Open abisheksm opened this issue 6 years ago • 9 comments

This might be a stupid question, but do you use edge attributes anywhere in the code? I can't seem to figure out where and how.

abisheksm avatar Jul 26 '19 15:07 abisheksm

I tried using the PyTorch Geometric implementation, dense_diff_pool(). When A contains edge attributes as an additional dimension, the following computation (using torch.matmul()) throws an exception: A^(l+1) = (S^(l))^T A^(l) S^(l), where (S^(l))^T is the transpose of S^(l).

dtchang avatar Aug 12 '19 17:08 dtchang

I didn't use edge attributes here; it should be easy to add. If the attributes are categorical, you can have a different weight matrix for each edge type and sum the messages over all edge types after every layer. If they are continuous, you can compute messages similarly to GAT.
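The categorical case above can be sketched roughly as follows (an R-GCN-style layer, not code from this repo; all sizes and variable names are made up for illustration):

```python
import torch

# Toy sizes (assumptions, not from the repo)
num_nodes, in_dim, out_dim, num_edge_types = 6, 8, 4, 3

torch.manual_seed(0)
x = torch.randn(num_nodes, in_dim)                        # node features
# One binary adjacency slice per categorical edge type:
A = (torch.rand(num_nodes, num_nodes, num_edge_types) > 0.5).float()
W = torch.randn(num_edge_types, in_dim, out_dim)          # one weight matrix per edge type

# Sum messages over all edge types: out = sum_r A[:, :, r] @ x @ W[r]
out = torch.zeros(num_nodes, out_dim)
for r in range(num_edge_types):
    out = out + A[:, :, r] @ x @ W[r]

print(out.shape)  # torch.Size([6, 4])
```

A nonlinearity and normalization per edge type would normally follow; this only shows the per-type-weights-then-sum structure.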

RexYing avatar Aug 13 '19 14:08 RexYing

> I tried using the PyTorch Geometric implementation, dense_diff_pool(). When A contains edge attributes as an additional dimension, the following computation (using torch.matmul()) throws an exception: A^(l+1) = (S^(l))^T A^(l) S^(l), where (S^(l))^T is the transpose of S^(l).

Could you check the dimensions of S and A? S should be num_nodes x num_next_level_nodes; A should be num_nodes x num_nodes.

RexYing avatar Aug 13 '19 14:08 RexYing

When A contains multiple edge attributes (many datasets, including mine, have them), its size is num_nodes x num_nodes x num_edge_attrs. I also reported this as an issue with PyG. The owner said dense_diff_pool() supports only a single edge attribute / weight, not multiple. It would be important to add such support. It would be nice if you could help.

dtchang avatar Aug 13 '19 14:08 dtchang

Thanks for the suggestions!

For now I could think of using PyTorch batch matmul:

```python
A_perm = A.permute(2, 0, 1)
S_perm = S.unsqueeze(0)
S_perm_T = S.T.unsqueeze(0)

A_next_level = S_perm_T @ A_perm @ S_perm
A_next_level = A_next_level.permute(1, 2, 0)
```

This should give the next-level adjacency of shape [num_clusters x num_clusters x num_edge_attrs].

Wonder if this works? This assumes a single clustering that takes all the edge-attr channels of the graph into account.
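A quick shape check of the idea above with toy sizes (num_nodes=5, num_clusters=3, num_edge_attrs=2 are arbitrary choices for illustration):

```python
import torch

num_nodes, num_clusters, num_edge_attrs = 5, 3, 2
A = torch.randn(num_nodes, num_nodes, num_edge_attrs)
S = torch.randn(num_nodes, num_clusters)

A_perm = A.permute(2, 0, 1)    # [attrs, N, N]
S_perm = S.unsqueeze(0)        # [1, N, C], broadcasts over the attrs dim
S_perm_T = S.T.unsqueeze(0)    # [1, C, N]

# [1, C, N] @ [attrs, N, N] @ [1, N, C] -> [attrs, C, C]
A_next = (S_perm_T @ A_perm @ S_perm).permute(1, 2, 0)
print(A_next.shape)  # torch.Size([3, 3, 2])
```

Broadcasting lets the single clustering S multiply every edge-attr slice of A without a Python loop.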

RexYing avatar Aug 13 '19 14:08 RexYing

Per your suggestion, I made the following changes in dense_diff_pool():

```python
if adj.dim() == 3:
    out_adj = torch.matmul(torch.matmul(s.transpose(1, 2), adj), s)
else:  # adj.dim() == 4
    adj_perm = adj.permute(0, 3, 1, 2)
    s_perm = s.unsqueeze(1)
    s_t = s.transpose(1, 2)
    s_t_perm = s_t.unsqueeze(1)
    out_adj_perm = torch.matmul(torch.matmul(s_t_perm, adj_perm), s_perm)
    out_adj = out_adj_perm.permute(0, 2, 3, 1)
```

That works out fine. Thanks much.
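The batched (dim-4) branch can be verified standalone with toy sizes (batch=2, N=5 nodes, C=3 clusters, E=2 edge attrs are assumptions for illustration):

```python
import torch

B, N, C, E = 2, 5, 3, 2
adj = torch.randn(B, N, N, E)
s = torch.softmax(torch.randn(B, N, C), dim=-1)   # soft cluster assignments

adj_perm = adj.permute(0, 3, 1, 2)                 # [B, E, N, N]
s_perm = s.unsqueeze(1)                            # [B, 1, N, C]
s_t_perm = s.transpose(1, 2).unsqueeze(1)          # [B, 1, C, N]

# [B, 1, C, N] @ [B, E, N, N] @ [B, 1, N, C] -> [B, E, C, C]
out_adj = torch.matmul(torch.matmul(s_t_perm, adj_perm), s_perm)
out_adj = out_adj.permute(0, 2, 3, 1)              # [B, C, C, E]
print(out_adj.shape)  # torch.Size([2, 3, 3, 2])
```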

However, the link_loss calculation now throws a RuntimeError when adj.dim() == 4:

```python
link_loss = adj - torch.matmul(s, s.transpose(1, 2))
```

What changes should I make?

dtchang avatar Aug 13 '19 18:08 dtchang

The following change eliminates the RuntimeError:

```python
if adj.dim() == 4:
    adj = adj.unbind(3)[0]
```

This would produce a good link_loss if the first edge attribute (type) is the edge weight. Is there a better way?
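One hedged alternative to unbinding only the first channel: compare S S^T against an aggregate over all edge-attr channels (here the mean), so every attribute contributes to the auxiliary link loss. The aggregation choice and all names below are assumptions, not part of dense_diff_pool():

```python
import torch

B, N, C, E = 2, 5, 3, 2
adj = torch.rand(B, N, N, E)                     # nonnegative edge attrs
s = torch.softmax(torch.randn(B, N, C), dim=-1)  # soft cluster assignments

adj_agg = adj.mean(dim=3)                        # [B, N, N], mean over edge attrs
link_loss = adj_agg - torch.matmul(s, s.transpose(1, 2))
link_loss = torch.norm(link_loss, p=2) / adj_agg.numel()
print(link_loss)  # scalar loss tensor
```

A weighted sum over channels (with learned weights) would be another option if the channels are on very different scales.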

dtchang avatar Aug 13 '19 19:08 dtchang

Doesn't this use only one of the edge-type dimensions? You could also give the weight matrices in the graph conv layer an extra dimension corresponding to the edge-attribute dimension, so that you don't have to unbind and use only one dimension.
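The extra-weight-dimension idea can be written compactly with einsum, contracting each edge-attr channel of A with its own weight slice (a sketch with made-up sizes, not code from this repo):

```python
import torch

N, E, in_dim, out_dim = 5, 2, 8, 4
A = torch.rand(N, N, E)              # adjacency with E edge-attr channels
x = torch.randn(N, in_dim)           # node features
W = torch.randn(E, in_dim, out_dim)  # weight slice per edge-attr channel

# out[i, o] = sum_e sum_j A[i, j, e] * x[j, d] * W[e, d, o]
out = torch.einsum('ije,jd,edo->io', A, x, W)
print(out.shape)  # torch.Size([5, 4])
```

This way every edge-attr channel influences the convolution instead of just the first one.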

RexYing avatar Sep 18 '19 17:09 RexYing

Have you guys resolved this issue? I'd like demo code for inputting my edge weights (1-D).

haojiang1 avatar Dec 02 '19 04:12 haojiang1