
Probability of the last word is always too small

Open · AliceZhang2016 opened this issue on Jul 08 '19 · 2 comments

Hi, after looking at the predictions for several Chinese phrases, I found that the probability of the last word is always too small compared to the other words in the same phrase. This also happens in all of the examples shown in your readme.md. As a result, the perplexity of these phrases also becomes very high.

What do you think about this phenomenon? Thanks for your attention.

AliceZhang2016 · Jul 08 '19
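For anyone who wants to reproduce this observation outside the repo, here is a minimal sketch of the mask-one-token-at-a-time scoring that this kind of evaluation relies on, written against the Hugging Face `transformers` library with the `bert-base-chinese` checkpoint (both are my assumptions, not this repository's actual code):

```python
# Minimal sketch of BERT pseudo-log-likelihood scoring (not this repo's code).
# Assumes the Hugging Face `transformers` library and `bert-base-chinese`.
import math

import torch
from transformers import BertForMaskedLM, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
model = BertForMaskedLM.from_pretrained("bert-base-chinese")
model.eval()

def token_probs(sentence):
    """Mask each real token in turn and return (token, probability) pairs."""
    ids = tokenizer(sentence, return_tensors="pt")["input_ids"][0]
    out = []
    for i in range(1, len(ids) - 1):  # skip [CLS] and [SEP]
        masked = ids.clone()
        masked[i] = tokenizer.mask_token_id
        with torch.no_grad():
            logits = model(masked.unsqueeze(0)).logits[0, i]
        p = torch.softmax(logits, dim=-1)[ids[i]].item()
        out.append((tokenizer.convert_ids_to_tokens(int(ids[i])), p))
    return out

def pseudo_perplexity(sentence):
    """exp of the mean negative log probability over all real tokens."""
    ps = [p for _, p in token_probs(sentence)]
    return math.exp(-sum(math.log(p) for p in ps) / len(ps))
```

Because pseudo-perplexity is the exponentiated mean negative log probability, a single very small last-word probability is enough to inflate it for the whole phrase, which would explain the high perplexities being reported.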

Try adding a period (。) at the end of the sentence...

liuhe6 · Oct 21 '19
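One quick way to check this suggestion with the `token_probs` sketch above (the example phrase is just an illustration, not one of the README examples):

```python
# Compare the final word's probability with and without a trailing 。,
# reusing the token_probs sketch above; the phrase is an arbitrary example.
for s in ["我爱北京天安门", "我爱北京天安门。"]:
    probs = token_probs(s)
    tok, p = probs[-2] if s.endswith("。") else probs[-1]  # final content word
    print(f"{s}: P({tok}) = {p:.6f}")
```

Presumably BERT's training data almost always ends sentences with punctuation, so a bare final word directly before [SEP] looks out of distribution and gets a low probability.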

I can confirm this problem exists.

aaronliu7 · Sep 03 '20