
A little confusion about the positional encoding.

Open soumitrapy opened this issue 2 years ago • 0 comments

Sir, I am a bit confused about the positional encoding in the bert.py file. Could you please explain it?

As described in the Transformer paper, the positional encoding is a fixed sin/cos function of the positions. But here you used an nn.Embedding layer. Why is nn.Embedding used for positional encoding?
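For reference, here is a minimal sketch of the two approaches I am comparing (standard PyTorch; the function and variable names are my own, not taken from bert.py). The first is the fixed sinusoidal scheme from the Transformer paper; the second is a plain lookup table whose rows are trained like any other parameter, which is what BERT uses:

```python
import math
import torch
import torch.nn as nn

def sinusoidal_encoding(max_len, d_model):
    # Fixed sin/cos positional encoding from "Attention Is All You Need".
    # Even dimensions get sin, odd dimensions get cos; nothing is trained.
    pe = torch.zeros(max_len, d_model)
    position = torch.arange(max_len, dtype=torch.float).unsqueeze(1)
    div_term = torch.exp(torch.arange(0, d_model, 2).float()
                         * (-math.log(10000.0) / d_model))
    pe[:, 0::2] = torch.sin(position * div_term)
    pe[:, 1::2] = torch.cos(position * div_term)
    return pe  # shape (max_len, d_model), not a learnable parameter

# BERT-style learned positional embedding: an nn.Embedding indexed by
# position ids, updated by backprop during training.
learned_pos = nn.Embedding(512, 768)  # (max_position_embeddings, hidden_size)

position_ids = torch.arange(8).unsqueeze(0)    # (1, seq_len)
fixed = sinusoidal_encoding(512, 768)[:8]      # fixed values, (8, 768)
learned = learned_pos(position_ids)            # trainable values, (1, 8, 768)
```

So both map a position index to a d_model-sized vector that gets added to the token embeddings; they only differ in whether the vectors are computed by a formula or learned.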

soumitrapy · Jun 26 '23 15:06