
implement graph concatenation

Open CarloLucibello opened this issue 4 years ago • 3 comments

When training on multiple small graphs, one typically batches several graphs together into a larger graph for efficiency. This operation is called `blockdiag` in SparseArrays and LightGraphs.jl.

For FeaturedGraphs, node and edge features should be vertically concatenated in the resulting graph. I'm not sure how we should handle global features; maybe we should just require them to be `== nothing` for all graphs as a start.
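A minimal sketch of the idea (not the GeometricFlux API): the batched adjacency matrix is the block diagonal of the individual adjacency matrices, and node features are stacked along the node dimension. This assumes node features are stored as `num_nodes × d` matrices, so "vertical concatenation" is `vcat`; the variable names are illustrative only.

```julia
using SparseArrays

# Two small graphs as sparse adjacency matrices.
A1 = sparse([0 1; 1 0])              # 2-node graph
A2 = sparse([0 1 0; 1 0 1; 0 1 0])   # 3-node graph (a path)

# Node features: one row per node, 4 features each.
X1 = rand(2, 4)
X2 = rand(3, 4)

# Batch: block-diagonal adjacency, vertically stacked features.
A = blockdiag(A1, A2)   # 5×5, with no edges between the two graphs
X = vcat(X1, X2)        # 5×4

@assert size(A) == (5, 5)
@assert iszero(A[1:2, 3:5])   # off-diagonal blocks are empty
@assert size(X) == (5, 4)
```

Edge features would be stacked the same way, since the edge lists of the batched graph are just the original edge lists with shifted node indices.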

CarloLucibello avatar Aug 06 '21 10:08 CarloLucibello

For the same issue, there may be another approach to deal with this. Has parallelism been considered?

> I'm not sure how we should handle global features, maybe we should just require them to be `== nothing` for all graphs as a start

I think the global features can be batched up and passed through layers, for example an MLP.
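One way this could look (a hypothetical sketch, not an existing API): stack each graph's global feature vector into a matrix with one column per graph, then apply an MLP layer to the whole batch. Plain matrix math stands in for e.g. a `Flux.Dense` layer here.

```julia
# Per-graph global feature vectors (8 features each).
u1 = rand(8)
u2 = rand(8)

# Batch them: one column per graph.
U = hcat(u1, u2)             # 8 × 2

# A single dense layer applied to the whole batch at once.
W, b = rand(4, 8), rand(4)   # illustrative parameters, 4 outputs
relu(x) = max.(x, 0)
out = relu(W * U .+ b)       # 4 × 2: one output column per graph

@assert size(out) == (4, 2)
```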

yuehhua avatar Aug 08 '21 12:08 yuehhua

> For the same issue, there may be another approach to deal with this. Has parallelism been considered?

In GNNs the batched graph size is essentially equivalent to the batch size, so yes, graph concatenation is done in order to leverage parallelized operations.
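To illustrate the point (a sketch under the same block-diagonal batching assumption as above): after batching, a single sparse matrix-matrix product propagates neighbor information for all graphs at once, which is where the parallel speedup comes from.

```julia
using SparseArrays

# Two 2-node graphs batched into one block-diagonal adjacency matrix.
A = blockdiag(sparse([0 1; 1 0]), sparse([0 1; 1 0]))  # 4×4
X = rand(4, 3)   # stacked node features, one row per node

# One multiply aggregates neighbor features for both graphs together;
# the zero off-diagonal blocks guarantee no mixing between graphs.
H = A * X

@assert size(H) == (4, 3)
```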

CarloLucibello avatar Aug 10 '21 08:08 CarloLucibello

The docs suggest this has been implemented, but the issue being open suggests it has not. Can someone clarify this?

eahenle avatar Sep 18 '23 19:09 eahenle