
How to use BERT contextualized embeddings

Open spookypineapple opened this issue 5 years ago • 2 comments

Is it possible to use BERT-based contextualized word embeddings with the nmt implementation? I want to take advantage of the pretrained BERT language model so the NMT weights can be devoted more to the task I'm solving, instead of being spent modeling the language itself (grammar, structure, etc.).

spookypineapple avatar Apr 08 '19 22:04 spookypineapple

Same question but with ELMo, I'm guessing you'd have to modify the embedding lookup

hichiaty avatar Apr 11 '19 16:04 hichiaty

@hichiaty @spookypineapple Wondering if you were able to use BERT or ELMo embeddings?

nashid avatar Feb 23 '22 17:02 nashid