
Connections with transformers?

Open askerlee opened this issue 1 year ago • 0 comments

Just came across your paper, and found that the formulation of co-attention is quite similar to transformers. In particular, a few (but not all) of the major ingredients, i.e., the Q and V projections and attention computed with a softmax after a dot product, also appear in transformers.
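To make the comparison concrete, here is a minimal sketch (in NumPy, my own illustration, not code from either paper) of the shared ingredient: attention weights obtained by a softmax over dot-product scores between projected queries and keys, then used to pool the values.

```python
import numpy as np

def dot_product_attention(Q, K, V):
    # Scores: dot product between queries and keys.
    # (Transformers additionally scale by 1/sqrt(d); earlier
    # attention formulations often omit this scaling.)
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    # Softmax over the key dimension gives the attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Weighted sum of values.
    return weights @ V

# Toy example: 3 queries attending over 5 key/value pairs of dim 4.
rng = np.random.default_rng(0)
Q, K, V = rng.random((3, 4)), rng.random((5, 4)), rng.random((5, 4))
out = dot_product_attention(Q, K, V)  # shape (3, 4)
```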

Considering that your work predates the transformer paper, do you think it may have inspired transformers? Thanks.

askerlee avatar Sep 13 '22 08:09 askerlee