
Mixed attention、Channel attention and Spatial attention

Open YANYANYEAH opened this issue 5 years ago • 4 comments

Hello, I studied your code carefully, and I noticed that the paper gives different formulas for mixed attention, channel attention, and spatial attention. But I don't see a formal implementation of F(x_i,c) in your code. I have just started learning about deep networks. How should I modify the network if I want to express the different attention types? Thank you!

YANYANYEAH avatar Apr 09 '19 12:04 YANYANYEAH

Sorry, I reread the article carefully and found the following paragraph: "Mixed attention f1 without additional restriction use simple sigmoid for each channel and spatial position. Channel attention f2 performs L2 normalization within all channels for each spatial position to remove spatial information. Spatial attention f3 performs normalization within feature map from each channel and then sigmoid to get soft mask related to spatial information only." But I don't know exactly how to implement f2 and f3. Suppose the feature size is [batch_size, channel, height, width]. Does f2 use `nn.BatchNorm2d(channel)` to normalize each channel? Does f3 use `nn.BatchNorm2d(height * width)` to normalize each spatial location and then a sigmoid?
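For what it's worth, here is one possible reading of that paragraph as plain tensor ops rather than `BatchNorm2d` layers (a sketch of my own, not code from this repo): f1 is an elementwise sigmoid, f2 L2-normalizes each spatial position across the channel dimension, and f3 standardizes each feature map per channel before the sigmoid.

```python
import torch

def mixed_attention(x):
    # f1: plain sigmoid applied independently at every channel
    # and spatial position, x: [B, C, H, W]
    return torch.sigmoid(x)

def channel_attention(x, eps=1e-6):
    # f2: L2-normalize across channels at each spatial position,
    # which removes spatial information from the mask
    norm = x.pow(2).sum(dim=1, keepdim=True).sqrt()
    return x / (norm + eps)

def spatial_attention(x, eps=1e-6):
    # f3: standardize within each feature map (per sample, per channel),
    # then sigmoid to get a soft spatial mask
    mean = x.mean(dim=(2, 3), keepdim=True)
    std = x.std(dim=(2, 3), keepdim=True)
    return torch.sigmoid((x - mean) / (std + eps))
```

Under this reading, no learnable normalization layer is needed for f2 and f3 at all; they are fixed activation functions applied to the mask branch output.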

YANYANYEAH avatar Apr 10 '19 02:04 YANYANYEAH

You can treat the Squeeze-and-Excitation network as channel attention f2: it applies global pooling to each channel and then an MLP to output a weight for each channel. Spatial attention, by contrast, means each pixel in every feature map gets its own weight.
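As a concrete illustration of that suggestion, here is a minimal SE-style block (a sketch, not code from this repo; the `reduction` ratio is the usual SE hyperparameter, not something fixed by this paper):

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-Excitation block used as channel attention:
    global pooling per channel, then an MLP producing per-channel weights."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = x.mean(dim=(2, 3))            # squeeze: global average pool -> [B, C]
        w = self.fc(w).view(b, c, 1, 1)   # excite: per-channel weights in (0, 1)
        return x * w                      # rescale each channel
```

Note this produces one scalar weight per channel, whereas the spatial/mixed masks in this repo assign a weight to every pixel.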

tengshaofeng avatar Apr 15 '19 08:04 tengshaofeng

Thank you very much for your answer. I had been trying to solve it with normalization; I will try each of these approaches.

YANYANYEAH avatar Apr 16 '19 03:04 YANYANYEAH

Hello, I am confused about the f1 (mixed) attention: does it mean applying conv -> relu -> conv -> sigmoid to the feature maps?
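If that reading is right, the mask branch would look something like the sketch below (my own illustration, assuming 1x1 convolutions and 64 channels for concreteness; the paper composes the mask with the trunk as (1 + M) * T):

```python
import torch
import torch.nn as nn

channels = 64  # assumed channel count for illustration

# conv -> relu -> conv -> sigmoid producing the soft mask M (f1, mixed attention)
mask_branch = nn.Sequential(
    nn.Conv2d(channels, channels, kernel_size=1),
    nn.ReLU(inplace=True),
    nn.Conv2d(channels, channels, kernel_size=1),
    nn.Sigmoid(),
)

x = torch.randn(2, channels, 16, 16)   # stand-in trunk features T
mask = mask_branch(x)                  # soft mask M, values in (0, 1)
out = (1 + mask) * x                   # residual attention: H = (1 + M) * T
```

Here the sigmoid plays the role of f1, acting independently on every channel and spatial position of the mask branch output.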

fengshenfeilian avatar Jun 14 '19 07:06 fengshenfeilian