
Implementation of EGNN

Open yuehhua opened this issue 3 years ago • 2 comments

  • [x] EquivGraphConv layer
  • [ ] EGNN network and example

Ref. E(n) Equivariant Graph Neural Networks

yuehhua avatar Dec 14 '21 06:12 yuehhua

@emsal0 is working on this issue.

yuehhua avatar May 16 '22 02:05 yuehhua

My current design is in my branch: https://github.com/FluxML/GeometricFlux.jl/compare/master...emsal0:EGNN?expand=1. Tests are still failing so I haven't opened a PR yet, and the implementation is not yet complete.

The input data to an EGNN layer is a graph where each node carries:

  • node features (called h in the paper), of dimension in_dim
  • a positional encoding (called x in the paper), which makes up the rest of the vector, of dimension x_dim

Thus each node's input is a vector of size in_dim + x_dim.
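To make the layout concrete, here is a minimal sketch (in Python/NumPy rather than Julia, with hypothetical sizes) of splitting one node's input vector back into its h and x parts:

```python
import numpy as np

in_dim, x_dim = 8, 3  # hypothetical sizes: 8 node features, 3-D positions

# One node's input vector: node features h followed by positional part x.
v = np.arange(in_dim + x_dim, dtype=float)

h, x = v[:in_dim], v[in_dim:]  # recover the two parts
assert h.shape == (in_dim,) and x.shape == (x_dim,)
```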

In the paper an EGNN layer is parameterized by three neural networks:

  • phi_e, mapping the concatenation of two nodes' features and their squared distance (dimension 2*in_dim + 1) to an arbitrary intermediate dimension int_dim
  • phi_x, mapping int_dim to dimension 1 (a scalar weight for the coordinate update)
  • phi_h, mapping dimension (in_dim + int_dim) to out_dim
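For reference, the paper's per-edge computation can be sketched with plain linear maps standing in for the three MLPs. This is a NumPy dimension check, not the branch's implementation, and all sizes are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
in_dim, int_dim, out_dim, x_dim, n = 4, 6, 5, 3, 7  # hypothetical sizes

# Linear stand-ins for the three MLPs, with the shapes described above.
W_e = rng.normal(size=(int_dim, 2 * in_dim + 1))    # phi_e
W_x = rng.normal(size=(1, int_dim))                 # phi_x
W_h = rng.normal(size=(out_dim, in_dim + int_dim))  # phi_h

h = rng.normal(size=(n, in_dim))  # node features
x = rng.normal(size=(n, x_dim))   # node positions

i, j = 0, 1  # one directed edge from node j to node i

# Edge message: m_ij = phi_e(h_i, h_j, ||x_i - x_j||^2)
d2 = np.sum((x[i] - x[j]) ** 2)
m_ij = W_e @ np.concatenate([h[i], h[j], [d2]])
assert m_ij.shape == (int_dim,)

# Position update: x_i' = x_i + (x_i - x_j) * phi_x(m_ij), a scalar weight
x_i_new = x[i] + (x[i] - x[j]) * (W_x @ m_ij)
assert x_i_new.shape == (x_dim,)

# Feature update: h_i' = phi_h(h_i, aggregated messages)
h_i_new = W_h @ np.concatenate([h[i], m_ij])
assert h_i_new.shape == (out_dim,)
```

With more than one neighbor, the messages m_ij would be summed over j before the phi_h step, as in the paper.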

The layer should be constructed with either:

  • Arguments (in_dim, int_dim, out_dim), a tuple of ints specifying the feature input dimension in_dim (the dimension of h), int_dim (arbitrary; the output dimension of phi_e in the paper), and out_dim. By default this would construct one-layer MLPs for phi_e, phi_x, and phi_h.
  • Arguments (nn_e, nn_x, nn_h), each a function (intended to be an MLP) with the dimensions specified for phi_e, phi_x, and phi_h above. Maybe it would be good to validate these inputs so that they are necessarily Flux.Chain objects populated only by Flux.Dense layers, but I am unsure whether that's the right decision to make. (Thus this path isn't implemented yet.)
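The two construction paths might look like the following hypothetical Python sketch (with linear maps in place of MLPs; this is not the branch's actual API):

```python
import numpy as np

class EquivGraphConv:
    """Hypothetical sketch of the two construction paths."""

    def __init__(self, nn_e, nn_x, nn_h):
        # Path 2: user-supplied networks with the phi_e/phi_x/phi_h shapes.
        self.nn_e, self.nn_x, self.nn_h = nn_e, nn_x, nn_h

    @classmethod
    def from_dims(cls, in_dim, int_dim, out_dim):
        # Path 1: build default one-layer maps of the appropriate shapes.
        rng = np.random.default_rng(0)
        lin = lambda i, o: (lambda z, W=rng.normal(size=(o, i)): W @ z)
        return cls(lin(2 * in_dim + 1, int_dim),   # phi_e
                   lin(int_dim, 1),                # phi_x
                   lin(in_dim + int_dim, out_dim)) # phi_h

layer = EquivGraphConv.from_dims(4, 6, 5)
assert layer.nn_x(np.zeros(6)).shape == (1,)
```

In Julia this dispatch would fall out naturally from two method signatures on the layer's constructor.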

emsal0 avatar May 23 '22 18:05 emsal0