
about bert

Open ypc-stu opened this issue 5 years ago • 4 comments

Hello, I want to use a pretrained embedding (trained with BERT) via the demo.train.config file, word_emb_dir=xxx (BERT). What does NCRF++ need changed for this? Or when will you release BERT embedding support?

ypc-stu avatar Nov 15 '19 09:11 ypc-stu
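As a stopgap while waiting for native support, one possible workaround is to precompute a static vector per word offline and dump it in the plain-text format NCRF++ reads from `word_emb_dir`. This is only a sketch under the assumption that NCRF++ expects GloVe-style lines (`word v1 v2 ... vN`, one token per line); the placeholder vectors below would be replaced by real BERT outputs, and note that freezing one vector per word discards BERT's context sensitivity.

```python
def save_embeddings_glove_format(vectors, path):
    """Write {word: [floats]} as GloVe-style text: 'word v1 v2 ...' per line."""
    with open(path, "w", encoding="utf-8") as f:
        for word, vec in vectors.items():
            f.write(word + " " + " ".join(f"{v:.6f}" for v in vec) + "\n")


if __name__ == "__main__":
    # Placeholder 4-dim vectors standing in for real BERT outputs
    # (e.g. averaged subword vectors from a Hugging Face model).
    vocab = {
        "hello": [0.1, 0.2, 0.3, 0.4],
        "world": [0.5, 0.6, 0.7, 0.8],
    }
    save_embeddings_glove_format(vocab, "bert_static.emb.txt")
```

The resulting file path would then be set as `word_emb_dir` in the config, with the embedding dimension matching the vector length.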

Good question. I planned to incorporate various pretrained embeddings half a year ago but couldn't find enough time to implement it. I am not sure when I can release it, but I will definitely work on this.

jiesutd avatar Nov 18 '19 03:11 jiesutd

Hi, I'm also thinking of using BERT embeddings with NCRF++. Since I saw someone already requested this, I'm just checking in to see whether it has been implemented yet.

CHENPoHeng avatar May 27 '20 15:05 CHENPoHeng

@CHENPoHeng Hi, we have implemented an initial version that integrates different BERT models (using Hugging Face) in the dev version, but we haven't evaluated it yet. We plan to run a detailed evaluation and then merge it into the master branch.

jiesutd avatar May 27 '20 21:05 jiesutd

Hi everyone, is it possible to use contextual word embeddings (BERT, ELMo, ...)? I have worked on sequence tagging using NCRF++ with the default word embeddings, and I am wondering how to use BERT there. Many thanks!

myeghaneh avatar Feb 15 '21 15:02 myeghaneh

Hello everyone, I know it is late, but we have updated NCRF++ to YATO (https://github.com/jiesutd/YATO), which can fully utilize large pretrained language models.

jiesutd avatar Jun 29 '23 18:06 jiesutd