attention-module
Question about Placement
Thank you for sharing this beautiful work. My network has three convolutional layers followed by global average pooling and a final linear layer. Do you recommend placing the BAM / CBAM blocks after each convolutional layer, or only after the first two? Lastly, do you have a comparison with Gather-Excite? https://arxiv.org/pdf/1810.12348.pdf
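For concreteness, here is a minimal sketch of the setup I mean, with a simplified CBAM-style block (channel attention then spatial attention) inserted after every conv layer. This is my own toy code, not this repo's implementation; all class names and hyperparameters (channel widths, `reduction=8`) are assumptions for illustration.

```python
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Channel attention: avg- and max-pooled features through a shared MLP."""
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))   # global average pooling
        mx = self.mlp(x.amax(dim=(2, 3)))    # global max pooling
        scale = torch.sigmoid(avg + mx).view(b, c, 1, 1)
        return x * scale


class SpatialAttention(nn.Module):
    """Spatial attention: channel-wise avg/max maps through a 7x7 conv."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)
        mx = x.amax(dim=1, keepdim=True)
        scale = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * scale


class CBAMBlock(nn.Module):
    """CBAM-style block: channel attention followed by spatial attention."""
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.channel = ChannelAttention(channels, reduction)
        self.spatial = SpatialAttention()

    def forward(self, x):
        return self.spatial(self.channel(x))


class SmallCNN(nn.Module):
    """Three conv layers (each followed by attention), GAP, then a linear head."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(inplace=True), CBAMBlock(16),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(inplace=True), CBAMBlock(32),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(inplace=True), CBAMBlock(64),
        )
        self.pool = nn.AdaptiveAvgPool2d(1)  # global average pooling
        self.fc = nn.Linear(64, num_classes)

    def forward(self, x):
        x = self.pool(self.features(x)).flatten(1)
        return self.fc(x)


model = SmallCNN(num_classes=10)
out = model(torch.randn(2, 3, 32, 32))
```

Dropping the third `CBAMBlock` is the "only after the first two" variant I am asking about.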