
How can I add an attention mechanism to Mask R-CNN? Is there a specific code implementation available?

Open · xiaoguaishoubaobao opened this issue 1 year ago · 1 comment

xiaoguaishoubaobao · Jan 29 '24 06:01

@xiaoguaishoubaobao I think you will have to inject your own attention blocks at the stage you want. Look into the model.py file and introduce the attention blocks at the point you choose. It would be nice to have an attention-integrated version of this repo, but I suspect it wouldn't be very generic, especially the placement of the attention module: it could go at the FPN outputs, inside the ResNet backbone, etc. I once extracted the ResNet+FPN part of this model and fed the FPN output features into CBAM in my Siamese network, so it is certainly possible.
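
For concreteness, here is a minimal CBAM sketch in Keras that you could wrap around the FPN outputs. To be clear, this is my own illustrative code, not something that exists in this repo: the function name `cbam_block`, the `ratio` hyperparameter, and the layer names are all my choices. It's written against tf.keras; Matterport's model.py uses standalone Keras (`import keras.layers as KL`), so adjust the imports and the `Lambda` bodies (e.g. `K.mean`/`K.max`) if you're on the original TF1 setup.

```python
# Minimal CBAM (channel attention followed by spatial attention) sketch.
# Assumes a tf.keras / TF2 environment; adapt imports for standalone Keras.
import tensorflow as tf
from tensorflow.keras import layers as KL

def cbam_block(feature_map, ratio=8, name="cbam"):
    """Apply CBAM to a 4-D (batch, H, W, C) feature map and return it reweighted."""
    channels = int(feature_map.shape[-1])

    # --- Channel attention: a shared MLP over avg- and max-pooled descriptors ---
    shared_dense_1 = KL.Dense(channels // ratio, activation="relu",
                              name=name + "_mlp1")
    shared_dense_2 = KL.Dense(channels, name=name + "_mlp2")

    avg_pool = KL.GlobalAveragePooling2D()(feature_map)
    max_pool = KL.GlobalMaxPooling2D()(feature_map)
    channel_att = KL.Activation("sigmoid")(
        KL.Add()([shared_dense_2(shared_dense_1(avg_pool)),
                  shared_dense_2(shared_dense_1(max_pool))]))
    channel_att = KL.Reshape((1, 1, channels))(channel_att)
    x = KL.Multiply()([feature_map, channel_att])

    # --- Spatial attention: 7x7 conv over channel-wise mean and max maps ---
    avg_map = KL.Lambda(lambda t: tf.reduce_mean(t, axis=-1, keepdims=True))(x)
    max_map = KL.Lambda(lambda t: tf.reduce_max(t, axis=-1, keepdims=True))(x)
    spatial_att = KL.Conv2D(1, 7, padding="same", activation="sigmoid",
                            name=name + "_spatial")(
        KL.Concatenate(axis=-1)([avg_map, max_map]))
    return KL.Multiply()([x, spatial_att])
```

In Matterport's model.py, the FPN levels (the P2..P5 tensors) are built inside `MaskRCNN.build()`, so one natural injection point is to wrap each of them right there, e.g. `P2 = cbam_block(P2, name="cbam_p2")`, before they're passed to the RPN and the heads. Keep in mind the pretrained COCO weights contain no parameters for these new layers, so you'll need to fine-tune after inserting them.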

nyinyinyanlin · Feb 22 '24 17:02