9 comments by Ye Bai

I observed the same phenomenon. Do you compute PPL with your language model? I found that the performance of my RNNLM trained with lingvo is not good enough.

I think it should be log_softmax rather than the log of the logits.
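
For context, a minimal TensorFlow sketch of the difference (the tensors here are illustrative, not taken from lingvo):

```python
import tensorflow as tf

logits = tf.constant([[2.0, 1.0, 0.1]])

# Log probabilities via a numerically stable log-softmax.
log_probs = tf.nn.log_softmax(logits, axis=-1)

# Taking the log of raw logits is not the same thing: logits are
# unnormalized scores and can be negative, so this is not a log probability.
wrong = tf.math.log(logits)
```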

It should be log_pplx_per_word. The training PPL of your model (exp(5) ≈ 148) is relatively high. I found that the LM trained with lingvo does not perform well. I don't...
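
The conversion I have in mind, assuming log_pplx_per_word is a natural-log per-word perplexity:

```python
import math

log_pplx_per_word = 5.0                # example value from a training log
ppl = math.exp(log_pplx_per_word)      # ~148.4
```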

I have not trained an LM on LibriSpeech. However, in my experiments, the PPL is worse than the 3-gram. I am still tuning it.

It is difficult to integrate a 3-gram LM into the lingvo ASR decoder, so I have not done it.
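
For reference only (not what was done here), a rough sketch of the offline alternative, rescoring n-best hypotheses with a 3-gram LM via the kenlm Python bindings; the ARPA path and weight are placeholders:

```python
import kenlm

lm = kenlm.Model('3gram.arpa')   # placeholder path to a 3-gram ARPA file
lm_weight = 0.3                  # would need tuning on a dev set

def rescore(nbest):
    """nbest: list of (hypothesis_text, asr_score) pairs; returns the best text."""
    rescored = []
    for text, asr_score in nbest:
        # kenlm returns the log10 probability of the whole sentence.
        lm_score = lm.score(text, bos=True, eos=True)
        rescored.append((text, asr_score + lm_weight * lm_score))
    return max(rescored, key=lambda x: x[1])[0]
```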

It is our own data, so the absolute PPL value is not comparable.

> I see~ I have been thinking about why LM fusion on lingvo is not good. We can communicate more in the future~

Hi, I would like to know how you...

Thank you very much! Very helpful information!

I also found that it is very slow...