do-you-even-need-attention

Exploring whether attention is necessary for vision transformers
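The core idea of the repo is to replace the self-attention layer in a vision transformer with a plain feed-forward layer applied over the patch dimension. A minimal numpy sketch of that patch-mixing step is below; the shapes, ReLU nonlinearity, and weight names are illustrative assumptions, not the repo's exact implementation:

```python
import numpy as np

def feedforward_over_patches(x, w1, w2):
    """Mix information across patches with an MLP applied along the
    patch axis, standing in where self-attention would normally go.
    x: (num_patches, dim); w1: (num_patches, hidden); w2: (hidden, num_patches).
    """
    h = np.maximum(x.T @ w1, 0.0)  # (dim, hidden); ReLU here for brevity
    return (h @ w2).T              # back to (num_patches, dim)

# Toy sizes, chosen only for illustration
num_patches, dim, hidden = 4, 8, 16
rng = np.random.default_rng(0)
x = rng.standard_normal((num_patches, dim))
w1 = rng.standard_normal((num_patches, hidden))
w2 = rng.standard_normal((hidden, num_patches))
out = feedforward_over_patches(x, w1, w2)
print(out.shape)  # (4, 8): same shape attention would produce
```

Because the MLP acts on the transposed input, every output patch is a learned combination of all input patches, which is the cross-patch communication attention normally provides, here at the cost of a fixed (position-dependent) mixing pattern.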

2 open issues

Hi Luke, thank you for sharing this amazing work. In your [arxiv document](https://arxiv.org/pdf/2105.02723.pdf), I cannot find any mention of positional encodings, but I see that you use them in your...

Hi, I was going through your experiment report. You made the point that since you are able to get good performance without using an attention layer, good performance...