
Fix incorrect SparseTensor handling in GCNConv forward pass

Open KAVYANSHTYAGI opened this issue 8 months ago • 3 comments

Bug Description

In the current GCNConv.forward() implementation, SparseTensor inputs are passed through gcn_norm, but the resulting normalized tensor is not unwrapped into edge_index and edge_weight.

This leads to a faulty call to self.propagate(...), which expects edge data in coordinate (COO) format rather than as a SparseTensor. Consequently, message passing yields incorrect results when SparseTensor is used.

Fix

This PR adds logic to extract edge_index and edge_weight from the normalized SparseTensor using .coo() and stacks them into a proper format. It also caches the unwrapped tensors if self.cached is True.
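To make the unwrapping step concrete, here is a minimal, dependency-free sketch of what "extract via .coo() and stack" means. The ToySparseTensor class is a hypothetical stand-in for torch_sparse.SparseTensor (whose real .coo() returns (row, col, value) tensors); with real torch tensors, the stacking step would be torch.stack([row, col], dim=0).

```python
class ToySparseTensor:
    """Illustrative stand-in for torch_sparse.SparseTensor.

    Mimics only the .coo() accessor, which returns (row, col, value).
    """

    def __init__(self, row, col, value):
        self._row, self._col, self._value = row, col, value

    def coo(self):
        return self._row, self._col, self._value


def unwrap_to_coo(adj):
    """Unwrap a (normalized) sparse adjacency into (edge_index, edge_weight).

    With torch, edge_index would be torch.stack([row, col], dim=0);
    plain lists are used here to keep the sketch self-contained.
    """
    row, col, value = adj.coo()
    edge_index = [row, col]
    edge_weight = value
    return edge_index, edge_weight


adj = ToySparseTensor([0, 1, 1], [1, 0, 2], [0.5, 0.5, 1.0])
edge_index, edge_weight = unwrap_to_coo(adj)
print(edge_index)   # [[0, 1, 1], [1, 0, 2]]
print(edge_weight)  # [0.5, 0.5, 1.0]
```

This is the COO layout self.propagate(...) expects when given an explicit edge_index and edge_weight, rather than the opaque SparseTensor object.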

Impact

- Ensures proper input for self.propagate(...) when using SparseTensor.
- Restores experimental correctness.
- Fully backward compatible.

Related Fixes #10047

Let me know if you'd like a test case; happy to contribute one!

KAVYANSHTYAGI avatar Apr 24 '25 16:04 KAVYANSHTYAGI

I think this already works as expected. Am I missing something?

rusty1s avatar May 20 '25 11:05 rusty1s

@KAVYANSHTYAGI

Let me know if you'd like a test case; happy to contribute one!

can you add a test case?

Kh4L avatar May 20 '25 18:05 Kh4L

I think this already works as expected. Am I missing something?

The issue arises specifically when a SparseTensor is passed as edge_index into GCNConv with cached=True. After applying gcn_norm, the resulting normalized tensor isn't unwrapped back into (edge_index, edge_weight) before self.propagate() is called, which leads to incorrect results.

This patch ensures that when SparseTensor is used, we correctly extract row, col, and edge_weight via .coo(), and cache them properly.
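A rough, dependency-free sketch of the caching behaviour described above (the class and attribute names are illustrative, not PyG's actual implementation). The point is that with cached=True, the normalize-and-unwrap step runs once and subsequent forward passes reuse the stored (edge_index, edge_weight) pair:

```python
class UnwrapCache:
    """Illustrative sketch: cache the unwrapped COO form of a normalized
    sparse adjacency, mirroring the cached=True behaviour described above."""

    def __init__(self, cached=True):
        self.cached = cached
        self._cache = None  # holds (edge_index, edge_weight) once computed

    def __call__(self, coo_fn):
        # coo_fn() stands in for gcn_norm(adj).coo()
        if self._cache is not None:
            return self._cache
        row, col, value = coo_fn()
        result = ([row, col], value)
        if self.cached:
            self._cache = result
        return result


calls = []

def fake_normalize_and_coo():
    """Counts invocations so we can verify normalization runs only once."""
    calls.append(1)
    return [0, 1], [1, 0], [0.5, 0.5]


cache = UnwrapCache(cached=True)
first = cache(fake_normalize_and_coo)
second = cache(fake_normalize_and_coo)
print(first == second, len(calls))  # True 1  (normalization ran only once)
```

Without this unwrapping-plus-caching step, a cached SparseTensor would be handed to self.propagate(...) in a form it does not interpret as COO edge data.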

KAVYANSHTYAGI avatar May 22 '25 08:05 KAVYANSHTYAGI