minGPT

About layer norm dimension parameter:

Open · vcvycy opened this issue 1 year ago · 1 comment

In model.py, layer norm is implemented as: self.ln_1 = nn.LayerNorm(config.n_embd). If batch_size = 64, block_size = 6, embedding_size = 48, then the input shape is [64, 6, 48] and the layer norm parameter shape is [48].

But in my opinion, the layer norm dimension should be [6, 48]. Am I wrong?
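For reference, a minimal sketch reproducing the shapes described above (the sizes are just the example values from this question, not minGPT defaults):

```python
import torch
import torch.nn as nn

# Example shapes: batch_size=64, block_size=6, n_embd=48
x = torch.randn(64, 6, 48)

ln = nn.LayerNorm(48)      # as in minGPT: nn.LayerNorm(config.n_embd)
print(ln.weight.shape)     # torch.Size([48]) -- one scale/shift per embedding channel
print(ln(x).shape)         # torch.Size([64, 6, 48]) -- normalized over the last dim only
```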

vcvycy · Jun 07 '23 15:06

In a model like GPT, normalization should be applied to the vectors that correspond to individual tokens within the block/sequence, because we want to normalize and center each vector within the scope of its own token.

What we don't want is to normalize across (tokens, channels), as that would center the entire sequence. Each token would then lose its individual characteristics and importance, and all the words would effectively be treated as the same or similar words.

That could blur the attention mechanism and produce gibberish, hallucinated output.
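A minimal sketch contrasting the two choices (shapes taken from the question above; nn.LayerNorm([block_size, n_embd]) is shown only to illustrate normalizing over the whole block, it is not what minGPT does):

```python
import torch
import torch.nn as nn

x = torch.randn(64, 6, 48)  # (batch, tokens, channels)

# Per-token norm (what minGPT does): each token's 48-dim vector gets
# zero mean / unit variance independently of the other tokens.
ln_token = nn.LayerNorm(48)
y = ln_token(x)
print(y.mean(dim=-1)[0, :3])     # ~0 for every token separately

# Normalizing over (tokens, channels) instead: statistics are shared across
# the whole 6x48 block, so a token with unusually large activations shifts
# the normalization of every other token in the sequence.
ln_block = nn.LayerNorm([6, 48])
z = ln_block(x)
print(z.mean(dim=(-2, -1))[:3])  # ~0 per sequence, not per token
```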

rjarun8 · Jun 28 '23 15:06