
Question about pretraining and special tokens

Open · King-of-Infinite-Space opened this issue 1 year ago · 0 comments

Thanks for releasing this model. I hope the authors can provide more information regarding the following questions:

  1. Was this model pretrained in the same way as the original BERT paper, i.e., with the masked LM and next sentence prediction (NSP) objectives?
  2. What was the format of the input sequences used in training? Were they complete sentences (e.g., couplets)?
  3. What is the meaning of the tokens * and # in the vocabulary? (A sketch for listing them follows below.)
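
For context, here is a minimal sketch of how one might list the vocabulary entries in question. It assumes the released checkpoint ships a standard BERT-style `vocab.txt` (one token per line); the path below is hypothetical and would need adjusting to wherever the archive was extracted.

```python
# Sketch: print vocabulary entries containing '*' or '#',
# assuming a standard one-token-per-line BERT vocab.txt.
from pathlib import Path

vocab_path = Path("BERT_CCPoem_v1/vocab.txt")  # hypothetical local path

for idx, line in enumerate(vocab_path.read_text(encoding="utf-8").splitlines()):
    token = line.strip()
    if "*" in token or "#" in token:
        print(idx, token)
```

Note that tokens prefixed with `##` are ordinarily WordPiece continuation pieces in BERT vocabularies; what is unclear is whether * and # here serve that role or something poetry-specific.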

King-of-Infinite-Space · Mar 01 '23 21:03