
Inference concat on different dimensions

Open · tawe141 opened this issue 4 years ago · 1 comment

Lately I've been experimenting with graph convolutions as part of a deep kernel, using torch-geometric. I'm using an ExactGP very similar to the one in the examples, with forward() implemented like this:

def forward(self, x, edge_index, batch):
...

where x has shape (N, D) and holds the per-node features, edge_index has shape (2, n_edge), and batch is a vector of integers indicating which graph each node belongs to within a batch of graphs.
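For context, here is a minimal sketch of the kind of model being described, assuming a single GCNConv feature extractor and mean pooling from torch-geometric; the class name, layer sizes, and pooling choice are illustrative, not taken from the issue:

import torch
import gpytorch
from torch_geometric.nn import GCNConv, global_mean_pool

class GraphDeepKernelGP(gpytorch.models.ExactGP):
    # train_inputs is the tuple (train_x, train_edge_index, train_batch)
    def __init__(self, train_inputs, train_y, likelihood, in_dim, hidden_dim=32):
        super().__init__(train_inputs, train_y, likelihood)
        self.conv = GCNConv(in_dim, hidden_dim)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())

    def forward(self, x, edge_index, batch):
        batch = batch.reshape(-1).long()                 # ExactGP may unsqueeze 1-D inputs to (N, 1); flatten back
        h = torch.relu(self.conv(x, edge_index.long()))  # per-node embeddings
        h = global_mean_pool(h, batch)                   # per-graph embeddings, shape (n_graphs, hidden_dim)
        return gpytorch.distributions.MultivariateNormal(self.mean_module(h), self.covar_module(h))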

Inference with ExactGP concatenates any new inputs to the training inputs along dim=-2, which in this case is fine for x but not for edge_index, which should be concatenated along dim=-1. I'd like to propose a small change: give ExactGP a method that concatenates new inputs onto the training inputs, so that subclasses can override the concatenation behavior as needed. Thoughts? Or is there a better way around this problem that I'm not seeing?
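For concreteness, one shape the proposal could take is a single overridable hook. The method name _concat_inputs below is made up for illustration, not existing GPyTorch API, and details such as re-indexing the appended graphs are glossed over:

import torch
import gpytorch

# Hypothetical default that ExactGP could expose (illustrative only):
#
#     def _concat_inputs(self, train_inputs, inputs):
#         # today's behavior: every input is stacked along the data dimension
#         return tuple(torch.cat([tr, te], dim=-2) for tr, te in zip(train_inputs, inputs))

class GraphExactGP(gpytorch.models.ExactGP):
    def _concat_inputs(self, train_inputs, inputs):
        (train_x, train_edges, train_batch), (x, edges, batch) = train_inputs, inputs
        # A real implementation would also need to offset the node indices stored in
        # edges and batch by the number of training nodes; omitted here for brevity.
        return (
            torch.cat([train_x, x], dim=-2),          # node features: stack rows, (N, D)
            torch.cat([train_edges, edges], dim=-1),  # edge_index: stack columns, (2, n_edge)
            torch.cat([train_batch, batch], dim=0),   # batch vector: works for (N,) or (N, 1)
        )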

tawe141 · Dec 01 '21

Could the same not be achieved by transposing the edge_index input?
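Something along these lines, continuing the model sketch above (a rough sketch, not a tested recipe): carry edge_index as (n_edge, 2) so the default concatenation along dim=-2 appends new edges as extra rows, and transpose it back inside forward():

    def forward(self, x, edge_index_t, batch):
        # edge_index is passed as (n_edge, 2); restore the (2, n_edge) layout torch-geometric expects
        edge_index = edge_index_t.t().contiguous().long()
        batch = batch.reshape(-1).long()
        h = torch.relu(self.conv(x, edge_index))
        h = global_mean_pool(h, batch)
        return gpytorch.distributions.MultivariateNormal(self.mean_module(h), self.covar_module(h))

This only changes the concatenation axis; the node indices in the appended edge_index and batch would presumably still need to be offset by the number of training nodes, as noted above.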

Balandat · Dec 02 '21