
Dropout inside GATConv layer

Open afternone opened this issue 2 years ago • 3 comments

In the (Optional) Exercises, dropout inside the GATConv layer is mentioned, but I did not find a keyword argument for dropout. How can I set dropout inside the GATConv layer?

afternone avatar Feb 24 '23 14:02 afternone

Actually, that is a missing feature. We should add the dropout option to GATConv and GATv2Conv; PyG has it.

CarloLucibello avatar Mar 16 '23 05:03 CarloLucibello

@CarloLucibello Can adding something like this after line 366 here do the job?

if l.dropout > 0 && Flux.training()
    x = Flux.dropout(x, l.dropout)
end

5hv5hvnk avatar Mar 16 '23 06:03 5hv5hvnk

Not really: dropout is performed on the attention coefficients, I think. It corresponds to masking the attention; see https://github.com/FluxML/NNlib.jl/blob/master/src/attention.jl

CarloLucibello avatar Mar 16 '23 07:03 CarloLucibello
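To illustrate the distinction being made above: in GAT-style attention dropout, the dropout mask is applied to the normalized attention coefficients before the neighbor aggregation, not to the node features. Below is a minimal, dependency-free sketch of that idea in plain Julia. It is not the GATConv implementation; the function names (`attend_with_dropout`) and the single-node, dense layout are illustrative assumptions only.

```julia
# Softmax over a vector of attention logits.
softmax(v) = (e = exp.(v .- maximum(v)); e ./ sum(e))

# Sketch of attention-coefficient dropout for one destination node.
# `alpha_logits` are the unnormalized attention scores over its neighbors,
# and each column of `neighbor_feats` is one neighbor's feature vector.
# With probability `p` a coefficient is zeroed (masking that edge), and
# survivors are rescaled by 1/(1-p) (inverted dropout), so at test time
# (`training=false`) no rescaling is needed.
function attend_with_dropout(alpha_logits, neighbor_feats, p; training=true)
    alpha = softmax(alpha_logits)            # normalized attention coefficients
    if training && p > 0
        mask = rand(length(alpha)) .>= p     # keep each edge with prob 1 - p
        alpha = alpha .* mask ./ (1 - p)     # mask + rescale the coefficients
    end
    return neighbor_feats * alpha            # weighted sum over neighbors
end
```

With `p = 0` or `training = false` this reduces to ordinary attention; for example, three neighbors with equal logits and features `[1.0 2.0 3.0]` aggregate to their mean, `2.0`.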