Bug in BAM code
It seems there is an issue in the lines linked below:

https://github.com/Jongchan/attention-module/blob/5d3a54af0f6688bedca3f179593dff8da63e8274/MODELS/bam.py#L9-L12

`gate_activation` is not defined anywhere in the code.
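For reference, here is a hypothetical minimal reproduction of how this surfaces at runtime, assuming the linked lines sit inside the `ChannelGate` constructor and that `BAM(gate_channel)` is the public entry point of `MODELS/bam.py` (the exact traceback may differ):

```python
# Hypothetical reproduction; assumes the repo root is on the Python path.
from MODELS.bam import BAM

bam = BAM(64)
# Expected failure, because the constructor references a name that is never
# defined or passed in:
#   NameError: name 'gate_activation' is not defined
```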
Apparently you don't need it. Just remove that part from the code; it's not used anywhere except on that line.
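For anyone hitting this before the repo is patched, below is a minimal sketch of the suggested fix. Only the offending assignment is taken from the linked lines; the constructor signature is kept for orientation, but the gate layers and forward pass are simplified placeholders, not the exact code from the repo.

```python
import torch
import torch.nn as nn

class ChannelGate(nn.Module):
    """Simplified sketch of BAM's channel gate; layer details are illustrative."""
    def __init__(self, gate_channel, reduction_ratio=16, num_layers=1):
        super(ChannelGate, self).__init__()
        # Offending line from the linked code: `gate_activation` is neither a
        # constructor argument nor defined elsewhere, and the attribute is never
        # read again, so the assignment can simply be deleted:
        #   self.gate_activation = gate_activation
        #
        # (`num_layers` is kept only to mirror the original signature; it is
        # unused in this simplified sketch.)
        self.gate_c = nn.Sequential(
            nn.Linear(gate_channel, gate_channel // reduction_ratio),
            nn.ReLU(),
            nn.Linear(gate_channel // reduction_ratio, gate_channel),
        )

    def forward(self, x):
        # Global average pool over H/W, run the gate MLP, broadcast back.
        avg = x.mean(dim=(2, 3))
        return self.gate_c(avg).unsqueeze(2).unsqueeze(3).expand_as(x)

# Construction no longer raises a NameError:
gate = ChannelGate(gate_channel=64)
out = gate(torch.randn(2, 64, 8, 8))
print(out.shape)  # torch.Size([2, 64, 8, 8])
```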