
About graph representation

Open JCwww opened this issue 5 years ago • 1 comment

Hi there, may I ask why you use the incidence matrix (H) to represent the hypergraph instead of the adjacency matrix (A)? According to hypergraph learning algorithms, both matrices can be used to represent a hypergraph. Also, how do you define the values of the weight matrix (W)?

Thanks!

JCwww avatar Dec 25 '19 02:12 JCwww

Hi, only in the case where the hypergraph is constructed from each vertex and its one-hop neighbors does the hypergraph incidence matrix have exactly the same format (every entry and dimension) as the graph adjacency matrix. A hypergraph can also be constructed from both one-hop and two-hop neighbors, yielding an incidence matrix of dimension N x 2N. In general, the hypergraph structure has a more flexible format with dimension N x M, where M denotes the number of hyperedges and can range from 1 to 2^N, whereas the graph adjacency matrix is always of dimension N x N. More precisely, a simple graph is just a special case: a 2-uniform hypergraph.
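To make the dimension comparison concrete, here is a minimal NumPy sketch (the toy graph and variable names are illustrative, not taken from the HGNN code): building one hyperedge per vertex from its one-hop neighborhood gives an N x N incidence matrix that matches A + I entry-wise, and appending a second group of hyperedges from two-hop neighborhoods gives N x 2N.

```python
import numpy as np

# Toy undirected path graph on 4 vertices: edges (0,1), (1,2), (2,3).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
N = A.shape[0]

# One hyperedge per vertex: the vertex itself plus its one-hop neighbors.
# The resulting incidence matrix is N x N and equals A + I entry-wise.
H_1hop = A + np.eye(N)

# Append hyperedges built from two-hop neighborhoods -> N x 2N incidence matrix.
A2 = (np.linalg.matrix_power(A, 2) > 0).astype(float)   # vertices reachable in two hops
H_2hop = np.minimum(A2 + A + np.eye(N), 1.0)            # two-hop neighborhood incl. self
H = np.concatenate([H_1hop, H_2hop], axis=1)            # shape (N, 2N) = (4, 8)
```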

Regarding the weight matrix W: throughout the paper, W is always an identity matrix, which gives the same weight to every hyperedge. A more reasonable weighted hypergraph neural network could be developed in future work.

Thanks for your attention! Yifan

yifanfeng97 avatar Dec 30 '19 02:12 yifanfeng97

I think that although W is always the identity matrix, the adjacency matrix A is still calculated in the code, which should play the role of the weights to a certain extent.

hetang-wang avatar Feb 28 '24 09:02 hetang-wang
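The matrix the comment above refers to can be sketched as the hypergraph propagation matrix G = D_v^{-1/2} H W D_e^{-1} H^T D_v^{-1/2} from the HGNN paper, where D_v and D_e are the vertex and hyperedge degree matrices. The function below is a minimal NumPy sketch under that assumption, not the repository's exact implementation; it shows where W enters and that the paper's default reduces W to identity weights.

```python
import numpy as np

def generate_G(H, W=None):
    """Compute G = D_v^{-1/2} H W D_e^{-1} H^T D_v^{-1/2}.

    A minimal sketch of the hypergraph convolution matrix from the
    HGNN paper; `W` is a vector of hyperedge weights, defaulting to
    all ones (the identity weighting used throughout the paper).
    """
    H = np.asarray(H, dtype=float)
    n_edges = H.shape[1]
    if W is None:
        W = np.ones(n_edges)                 # identity hyperedge weights
    Dv = H @ W                               # weighted vertex degrees
    De = H.sum(axis=0)                       # hyperedge degrees
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(Dv))
    De_inv = np.diag(1.0 / De)
    return Dv_inv_sqrt @ H @ np.diag(W) @ De_inv @ H.T @ Dv_inv_sqrt
```

Since W and D_e^{-1} are diagonal, H W D_e^{-1} H^T is symmetric, so G is a symmetric matrix even with the identity weighting.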