
canonical_etypes should be used in HeteroGraphConv

Open sbocconi opened this issue 3 years ago • 2 comments

🐛 Bug

Trying to train on this kind of graph:

metagraph=[('tweet', 'url', 'links'), ('tweet', 'user', 'mentions'), ('tweet', 'hashtag', 'tags'), ('user', 'url', 'links'), ('user', 'user', 'mentions'), ('user', 'hashtag', 'tags'), ('user', 'tweet', 'tweets')])

leads to error:

dgl._ffi.base.DGLError: Edge type "links" is ambiguous. Please use canonical edge type in the form of (srctype, etype, dsttype)

but HeteroGraphConv takes as input:

mods : dict[str, nn.Module] Modules associated with every edge types.

So canonical types cannot be used.

It seems impossible to use HeteroGraphConv when the same relation name is used between different pairs of node types.

see also #3435

To Reproduce

Steps to reproduce the behavior:

  1. Create a graph with the same relation name between more than one pair of node types, such as ('tweet', 'url', 'links') and ('user', 'url', 'links').
  2. Create a HeteroGraphConv module with something like
     HeteroGraphConv({name: GraphConv(in_size, out_size) for name in etypes})
  3. Call the forward function of HeteroGraphConv (a minimal sketch follows below).
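
A minimal, self-contained sketch of these steps; the node counts, feature sizes, and the dglnn alias are illustrative and not taken from the report:

    import torch
    import dgl
    import dgl.nn as dglnn

    # Two canonical edge types that share the relation name 'links'.
    g = dgl.heterograph({
        ('tweet', 'links', 'url'): (torch.tensor([0, 1]), torch.tensor([0, 1])),
        ('user', 'links', 'url'): (torch.tensor([0, 0]), torch.tensor([0, 1])),
    })

    # g.etypes lists 'links' twice here, so this dict comprehension collapses
    # both canonical edge types onto a single 'links' key.
    conv = dglnn.HeteroGraphConv({
        etype: dglnn.GraphConv(8, 4) for etype in g.etypes
    })

    feats = {ntype: torch.randn(g.num_nodes(ntype), 8) for ntype in g.ntypes}
    # Per the report, this fails with:
    # DGLError: Edge type "links" is ambiguous. Please use canonical edge type ...
    out = conv(g, feats)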

Expected behavior

It should be possible to specify canonical etypes as keys of the mods dictionary passed to HeteroGraphConv, thereby removing the ambiguity.
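
For illustration, the requested interface might look like the following; the canonical-tuple keys are hypothetical and not the DGL API as of this report:

    import dgl.nn as dglnn

    # Hypothetical keying by canonical edge type, so the two 'links' relations
    # map to distinct, unambiguous modules.
    conv = dglnn.HeteroGraphConv({
        ('tweet', 'links', 'url'): dglnn.GraphConv(8, 4),
        ('user', 'links', 'url'): dglnn.GraphConv(8, 4),
    })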

Environment

  • DGL Version (e.g., 1.0): dgl==0.6.0

Additional context

sbocconi avatar Jan 16 '22 20:01 sbocconi

I also encountered this error just a few minutes ago.

LawsonAbs avatar Jan 23 '22 03:01 LawsonAbs

I think there are two use cases:

  • Use the same NN module for ('tweet', 'url', 'links') and ('user', 'url', 'links') as a way of parameter sharing/regularization.
  • Use different NN modules for ('tweet', 'url', 'links') and ('user', 'url', 'links').

I believe both use cases are valid and we should extend HeteroGraphConv to support them. I've marked this as a feature request to be implemented.
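
A hedged sketch of the first use case, again assuming canonical-etype keys were accepted (the exact key format of the eventual fix may differ); the second use case matches the sketch under "Expected behavior" above:

    import dgl.nn as dglnn

    # Hypothetical: sharing one GraphConv instance across both canonical types
    # ties their parameters; separate instances would keep them independent.
    shared = dglnn.GraphConv(8, 4)
    conv = dglnn.HeteroGraphConv({
        ('tweet', 'links', 'url'): shared,
        ('user', 'links', 'url'): shared,
    })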

jermainewang avatar Aug 01 '22 09:08 jermainewang

Thank you very much for this fix. Will it also be included in the CUDA builds, for example dgl-cu111?

sbocconi avatar Sep 20 '22 07:09 sbocconi

It is not in the latest v0.9.1 release yet, but it will be in the nightly build within a day.

jermainewang avatar Sep 20 '22 08:09 jermainewang