Mask_RCNN
How can I add an attention mechanism to Mask R-CNN? Is there a specific code implementation available?
@xiaoguaishoubaobao I think you will have to inject your own attention blocks at the stage you want. Look into model.py and introduce your attention blocks at the point you choose. It would be nice to have an attention-integrated version of this repo, but it's not very generic, especially the placement of the attention module: it could sit on the FPN outputs, inside the ResNet backbone, etc. I once extracted the ResNet+FPN part of this model and fed the FPN output features into CBAM in my Siamese network model, so it is possible.
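To make the idea concrete, here is a minimal CBAM-style block (channel attention followed by spatial attention) written with tf.keras layers, in the same functional style model.py uses. This is only a sketch: the function name `cbam_block`, the `ratio` parameter, and the layer names are illustrative and not part of the Mask_RCNN codebase.

```python
# Minimal CBAM sketch (channel attention + spatial attention).
# Assumes TF2-style tf.keras; matterport/Mask_RCNN uses standalone Keras,
# so imports may need adjusting to match your environment.
import tensorflow as tf
from tensorflow.keras import layers as KL

def cbam_block(feature_map, ratio=8, name="cbam"):
    """Apply channel attention, then spatial attention, to a 4D feature map."""
    channels = int(feature_map.shape[-1])

    # Channel attention: shared MLP over global average- and max-pooled features.
    shared_dense_1 = KL.Dense(channels // ratio, activation="relu", name=name + "_mlp1")
    shared_dense_2 = KL.Dense(channels, name=name + "_mlp2")
    avg_pool = KL.GlobalAveragePooling2D()(feature_map)
    max_pool = KL.GlobalMaxPooling2D()(feature_map)
    channel_att = KL.Activation("sigmoid")(
        KL.Add()([shared_dense_2(shared_dense_1(avg_pool)),
                  shared_dense_2(shared_dense_1(max_pool))]))
    channel_att = KL.Reshape((1, 1, channels))(channel_att)
    x = KL.Multiply()([feature_map, channel_att])

    # Spatial attention: 7x7 conv over the channel-wise mean and max maps.
    avg_map = KL.Lambda(lambda t: tf.reduce_mean(t, axis=-1, keepdims=True))(x)
    max_map = KL.Lambda(lambda t: tf.reduce_max(t, axis=-1, keepdims=True))(x)
    spatial_att = KL.Conv2D(1, 7, padding="same", activation="sigmoid",
                            name=name + "_spatial")(KL.Concatenate()([avg_map, max_map]))
    return KL.Multiply()([x, spatial_att])
```

One possible placement, assuming you go with the FPN-output option: in MaskRCNN.build() in model.py, wrap the pyramid feature maps (P2–P5) right after they are constructed, e.g. `P2 = cbam_block(P2, name="cbam_p2")`, before they are passed to the RPN and the heads. Whether attention helps more there or inside the ResNet stages is something you would have to verify experimentally.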