GraphNeuralNetworks.jl
Dropout inside GATConv layer
In the (Optional) Exercises, dropout inside the GATConv layer is mentioned, but I did not find a keyword parameter for dropout. How can I set dropout inside the GATConv layer?
Actually, that is a missing feature. We should add the dropout option to GATConv and GATv2Conv; PyG has it.
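For reference, a minimal sketch of what the constructor could look like once the option lands; the `dropout` keyword below is hypothetical (assumed to mirror PyG's argument of the same name) and is not part of the current GATConv signature:

```julia
using GraphNeuralNetworks

# Hypothetical `dropout` keyword, assumed to behave like PyG's:
# randomly drop attention coefficients during training.
layer = GATConv(8 => 8; heads = 8, dropout = 0.6)
```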
@CarloLucibello Can adding something like this after line 366 here do the job?
```julia
if l.dropout > 0 && Flux.istraining()  # only apply dropout in training mode
    x = Flux.dropout(x, l.dropout)
end
```
Not really; dropout is performed on the attention coefficients, I think. It corresponds to masking the attention; see https://github.com/FluxML/NNlib.jl/blob/master/src/attention.jl
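To make the distinction concrete, here is a minimal sketch (not the package's implementation) of dropout on attention coefficients: the post-softmax weights are randomly masked, so each node ignores a random subset of its neighbors during training. NNlib's `dot_product_attention` exposes this through its `fdrop` keyword, which is applied to the coefficients after the softmax; note that `NNlib.dropout` itself always drops, so a real layer would gate it on training mode.

```julia
using NNlib  # provides softmax, dropout, and dot_product_attention

# Toy attention logits for 5 nodes (dense, for illustration only).
e = randn(Float32, 5, 5)
α = softmax(e; dims = 1)            # attention coefficients per node
α_masked = NNlib.dropout(α, 0.6f0)  # dropout = random masking of the coefficients

# The same idea via NNlib's attention API: fdrop is applied to the
# post-softmax attention scores, i.e. it masks the attention.
q = k = v = randn(Float32, 8, 5, 1)  # (features, length, batch)
y, α2 = dot_product_attention(q, k, v; fdrop = x -> NNlib.dropout(x, 0.6))
```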