AttaNet
Structural issues with the Strip Attention Module
While comparing the SAM part of the code with the paper, I found that the operations applied to Q and K do not correspond to each other. Simply swapping the names of Q and K in the code does not fix it either, because the subsequent transpose operation is still performed on Q. See the shape sketch below for what I mean.
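
To make the mismatch concrete, here is a shape-only sketch of the two orderings I am comparing. The tensor names and sizes are made up for illustration and this is not the repository's actual forward pass; my reading of the paper (strip pooling on K, transpose on the full-resolution Q) may of course be wrong, which is exactly what I would like to clarify.

```python
import torch
import torch.nn.functional as F

# Illustrative shapes only; q and k stand in for the two branches
# after their 1x1 convolutions.
B, C, H, W = 2, 64, 32, 48
q = torch.randn(B, C, H, W)
k = torch.randn(B, C, H, W)

# Ordering as I read the released code: Q is strip-pooled over H and then
# transposed, while K keeps the full H*W resolution.
q_code = F.avg_pool2d(q, kernel_size=(H, 1)).view(B, C, W)   # (B, C, W)
q_code = q_code.permute(0, 2, 1)                             # (B, W, C)
k_code = k.view(B, C, H * W)                                 # (B, C, H*W)
attn_code = torch.bmm(q_code, k_code)                        # (B, W, H*W)

# Ordering as I read the paper: K is the strip-pooled branch, and the
# transpose is applied to the full-resolution Q instead.
q_paper = q.view(B, C, H * W).permute(0, 2, 1)               # (B, H*W, C)
k_paper = F.avg_pool2d(k, kernel_size=(H, 1)).view(B, C, W)  # (B, C, W)
attn_paper = torch.bmm(q_paper, k_paper)                     # (B, H*W, W)

print(attn_code.shape)   # torch.Size([2, 48, 1536])
print(attn_paper.shape)  # torch.Size([2, 1536, 48])
```

As the sketch shows, renaming the variables alone only changes which branch is called Q; the pooling and the transpose would still sit on the same branch, so the resulting attention map keeps the same orientation as before.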