
Frequency in the positional encodings

Open FAhtisham opened this issue 4 years ago • 0 comments

What does the frequency represent in the positional encoding? Why do we need to multiply it by the positional values?

frequencies = torch.pow(10000., -torch.arange(0, model_dimension, 2, dtype=torch.float) / model_dimension)
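For context, here is a minimal sketch (not the repository's exact code; `max_seq_len` and the other variable names are just for illustration) of how these frequencies are typically used in the sinusoidal positional encoding from "Attention Is All You Need". Multiplying each frequency by the position index gives the argument (phase) of a sine/cosine, so each dimension pair oscillates at a different wavelength and every position gets a distinct pattern of values:

```python
import torch

max_seq_len = 50
model_dimension = 512

# One frequency per pair of dimensions: 10000^(-2i / d_model)
frequencies = torch.pow(10000., -torch.arange(0, model_dimension, 2, dtype=torch.float) / model_dimension)

# Positions 0, 1, ..., max_seq_len - 1 as a column vector
positions = torch.arange(max_seq_len, dtype=torch.float).unsqueeze(1)

# Outer product: position * frequency is the angle fed to sin/cos.
# Low frequencies vary slowly across positions, high frequencies vary quickly.
angles = positions * frequencies  # shape: (max_seq_len, model_dimension // 2)

positional_encodings = torch.zeros(max_seq_len, model_dimension)
positional_encodings[:, 0::2] = torch.sin(angles)  # even dimensions
positional_encodings[:, 1::2] = torch.cos(angles)  # odd dimensions
```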

FAhtisham, Jul 23 '21 20:07