Transformer_Relative_Position_PyTorch
PyTorch implementation of the paper "Self-Attention with Relative Position Representations" (Shaw et al., 2018).
For the entire Seq2Seq framework, you can refer to this repo.
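The core idea of the paper is to add learned embeddings of clipped relative distances to the keys and values inside self-attention, instead of using absolute position encodings. A minimal single-head sketch of that mechanism is below; it is an illustration of the technique, not this repo's actual code, and the class and parameter names (e.g. `max_relative_position`) are my own.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RelativePositionSelfAttention(nn.Module):
    """Single-head self-attention with relative position representations
    (Shaw et al., 2018). Relative distances j - i are clipped to
    [-k, k], giving 2k + 1 learned embeddings a^K and a^V."""

    def __init__(self, d_model, max_relative_position=4):
        super().__init__()
        self.d_model = d_model
        self.k = max_relative_position
        self.q_proj = nn.Linear(d_model, d_model, bias=False)
        self.k_proj = nn.Linear(d_model, d_model, bias=False)
        self.v_proj = nn.Linear(d_model, d_model, bias=False)
        # 2k+1 relative-position embeddings for keys and for values
        self.rel_k = nn.Embedding(2 * max_relative_position + 1, d_model)
        self.rel_v = nn.Embedding(2 * max_relative_position + 1, d_model)

    def _rel_ids(self, seq_len):
        # Matrix of clipped relative distances, shifted into [0, 2k]
        pos = torch.arange(seq_len)
        dist = pos[None, :] - pos[:, None]
        return dist.clamp(-self.k, self.k) + self.k

    def forward(self, x):
        # x: (batch, seq_len, d_model)
        b, n, d = x.shape
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        rel_ids = self._rel_ids(n)          # (n, n)
        a_k = self.rel_k(rel_ids)           # (n, n, d)
        a_v = self.rel_v(rel_ids)           # (n, n, d)
        # e_ij = q_i · (k_j + a^K_ij) / sqrt(d)
        scores = q @ k.transpose(-2, -1)    # content term, (b, n, n)
        scores = scores + torch.einsum("bnd,nmd->bnm", q, a_k)
        attn = F.softmax(scores / d ** 0.5, dim=-1)
        # z_i = sum_j alpha_ij * (v_j + a^V_ij)
        return attn @ v + torch.einsum("bnm,nmd->bnd", attn, a_v)

# Usage sketch
layer = RelativePositionSelfAttention(d_model=8, max_relative_position=2)
out = layer(torch.randn(2, 5, 8))   # (batch=2, seq_len=5, d_model=8)
```

The paper also shares the relative embeddings across heads and uses an efficient reshaping trick instead of the explicit `einsum` over the (n, n, d) tensor; the version above favors readability over that optimization.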