pytorch-openai-transformer-lm
So we cannot change the word embeddings of the pretrained LM?
And training a new LM is very hard?
I do not think you can change the word embeddings easily, since their dimension must match the output dimension of each layer, which is 768 (`cfg.n_embd`) for the pre-trained model.
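To see why, here is a minimal NumPy sketch of the constraint (the vocab size and shapes are illustrative, not the repo's actual values): in a residual transformer, each block's output is added back to its input, so the embedding dimension is locked to the hidden size throughout the network.

```python
import numpy as np

n_embd = 768      # hidden size of the pre-trained model (cfg.n_embd)
vocab_size = 100  # hypothetical vocab size, for illustration only

# Embedding table: each token id maps to an n_embd-dimensional vector.
embed = np.random.randn(vocab_size, n_embd) * 0.02

tokens = np.array([10, 42, 7])
h = embed[tokens]  # shape (3, 768)

# Each transformer block is residual: h = h + block(h).
# The addition only works if the block output is also (seq_len, n_embd),
# so swapping in embeddings of a different width breaks every layer.
block_out = np.random.randn(3, n_embd)  # stand-in for a block's output
h = h + block_out

assert h.shape == (3, n_embd)
```

Changing the embedding size would therefore mean changing the width of every layer, which discards the pre-trained weights.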
Training a new language model from scratch is indeed quite expensive and tedious.