Entropy
Hi, is there a function call in KenLM to calculate entropy directly? Also, what is the math behind model.score — does it return the log probability or the probability itself?
Is it true that in your code:

entropy(sent) = -model.score(sent) / length(sent)

perplexity(sent) = 10 ^ (-model.score(sent) / length(sent))
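For concreteness, here is a minimal sketch of what I mean, using the Python wrapper. The model path is just a placeholder, and the +1 for the implicit </s> token in the word count is my own assumption about how length should be counted:

```python
import kenlm

# Assumption: 'example.arpa' is a placeholder path; model.score returns the
# total log10 probability of the sentence (with <s> and </s> by default).
model = kenlm.Model('example.arpa')

def entropy(sentence):
    # Per-word negative log10 probability; the +1 counts the implicit </s>
    # token (my assumption about how length is meant to be measured).
    n_words = len(sentence.split()) + 1
    return -model.score(sentence) / n_words

def perplexity(sentence):
    # Perplexity as 10 raised to the per-word negative log10 probability.
    return 10.0 ** entropy(sentence)

sent = 'this is a test'
print(entropy(sent))
print(perplexity(sent))
# If the wrapper exposes model.perplexity, I would expect this to match:
# print(model.perplexity(sent))
```

Is that consistent with what the library actually computes?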
Thanks!