attention-module
Element-wise summation
Hi, in your paper you say you chose element-wise summation to combine the two branches, but in your code you use a product:
https://github.com/Jongchan/attention-module/blob/1a23ae52aa4669ad655c41fc2bb6957ce5d70f6e/MODELS/bam.py#L48
Could you elaborate on this? Thanks
Look at the ablation study section in the paper: sum gets better performance than product, so you may want to change the product to a sum, as in the sketch below.
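For concreteness, here is a minimal sketch of the change being discussed, assuming the forward pass looks roughly like the linked line in `bam.py`. The `BAMCombine` wrapper and its `channel_att`/`spatial_att` arguments are hypothetical stand-ins for the repo's gate modules, not the actual implementation:

```python
import torch
import torch.nn as nn

class BAMCombine(nn.Module):
    """Hypothetical sketch: combine the two BAM branches by element-wise sum,
    matching M(F) = sigmoid(M_c(F) + M_s(F)) from the paper, instead of the
    product used at the linked line. channel_att / spatial_att stand in for
    the repo's channel and spatial gate modules."""

    def __init__(self, channel_att: nn.Module, spatial_att: nn.Module):
        super().__init__()
        self.channel_att = channel_att
        self.spatial_att = spatial_att

    def forward(self, in_tensor: torch.Tensor) -> torch.Tensor:
        # Linked line 48 (product of the two branch maps):
        # att = 1 + torch.sigmoid(self.channel_att(in_tensor) * self.spatial_att(in_tensor))
        # Element-wise sum, as described in the paper:
        att = 1 + torch.sigmoid(self.channel_att(in_tensor) + self.spatial_att(in_tensor))
        return att * in_tensor
```

The only change from the linked code is swapping `*` for `+` inside the sigmoid; the `1 +` residual scaling and the final multiplication with the input are unchanged.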
How accurate is your BAM-based model on CIFAR-100?