OpenPrompt
Is your implementation of Prefix tuning consistent with that of Li & Liang?
According to your source code (https://github.com/thunlp/OpenPrompt/blob/0d7774c9bd537c96512a22ada1b3c9bf466df8f2/openprompt/prompts/prefix_tuning_template.py#L188), it seems that you only add prefix key-value pairs to the self-attention in the encoder and decoder, but ignore the cross-attention in the decoder. If my understanding is correct, your implementation differs from the original paper, because the original paper and some subsequent work add prefixes to all three types of attention for encoder-decoder LMs.
I'm not sure whether I understand this correctly. If anyone else has noticed this issue, we could discuss it together.
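For concreteness, here is a minimal sketch of what "prefixes for all three attention types" could look like for an encoder-decoder LM. The class name, dimensions, and the three-way split (encoder self-attention, decoder self-attention, decoder cross-attention) are illustrative assumptions for discussion, not OpenPrompt's actual implementation.

```python
import torch
import torch.nn as nn

# Hypothetical dimensions, chosen only for illustration.
NUM_LAYERS = 2
NUM_HEADS = 8
HEAD_DIM = 64
PREFIX_LEN = 5

class FullPrefixEncoder(nn.Module):
    """Produces prefix key/value pairs for all three attention types of an
    encoder-decoder LM, in the spirit of Li & Liang (2021). The three separate
    reparameterized embedding tables are an assumption, not OpenPrompt code."""

    def __init__(self):
        super().__init__()
        kv_dim = NUM_LAYERS * 2 * NUM_HEADS * HEAD_DIM  # key + value per layer
        # One learned prefix per attention type.
        self.enc_self = nn.Sequential(nn.Embedding(PREFIX_LEN, 512), nn.Linear(512, kv_dim))
        self.dec_self = nn.Sequential(nn.Embedding(PREFIX_LEN, 512), nn.Linear(512, kv_dim))
        self.dec_cross = nn.Sequential(nn.Embedding(PREFIX_LEN, 512), nn.Linear(512, kv_dim))

    def _expand(self, module, batch_size):
        idx = torch.arange(PREFIX_LEN).unsqueeze(0).expand(batch_size, -1)
        kv = module(idx)  # (batch, prefix_len, num_layers * 2 * num_heads * head_dim)
        kv = kv.view(batch_size, PREFIX_LEN, NUM_LAYERS, 2, NUM_HEADS, HEAD_DIM)
        # -> (num_layers, 2, batch, num_heads, prefix_len, head_dim)
        return kv.permute(2, 3, 0, 4, 1, 5)

    def forward(self, batch_size):
        return {
            "encoder_self_attention": self._expand(self.enc_self, batch_size),
            "decoder_self_attention": self._expand(self.dec_self, batch_size),
            "decoder_cross_attention": self._expand(self.dec_cross, batch_size),
        }

prefixes = FullPrefixEncoder()(batch_size=4)
for name, kv in prefixes.items():
    print(name, tuple(kv.shape))  # (num_layers, 2, batch, num_heads, prefix_len, head_dim)
```

In the linked file, only the first two of these appear to be constructed; if this reading is right, the decoder's cross-attention never sees a prefix.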
Same question.