4 comments by purswaninuri

Hi, I am also having the same issue: I get a perplexity score of 17 with kNN-LM on the Wikipedia dataset, while RetoMaton is in the ballpark of 13 and...
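For context on where these perplexity numbers come from, a minimal sketch of the kNN-LM interpolation step (Khandelwal et al.'s formulation) is below. All function and parameter names here are illustrative assumptions, not the actual API of the repo being discussed:

```python
import numpy as np

def knn_lm_probs(p_lm, knn_distances, knn_token_ids, vocab_size,
                 lmbda=0.25, temperature=1.0):
    """Sketch: interpolate base-LM probabilities with a kNN distribution.

    p_lm:           (vocab_size,) base LM next-token probabilities.
    knn_distances:  (k,) L2 distances of retrieved datastore entries.
    knn_token_ids:  (k,) target token id stored with each retrieved entry.
    lmbda:          interpolation weight on the kNN distribution (assumed value).
    """
    # Closer neighbors get higher weight: softmax over negative distances.
    logits = -np.asarray(knn_distances, dtype=float) / temperature
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()

    # Scatter-add neighbor weights onto their target tokens
    # (np.add.at handles repeated token ids correctly).
    p_knn = np.zeros(vocab_size)
    np.add.at(p_knn, np.asarray(knn_token_ids), weights)

    # Final distribution: lambda * p_knn + (1 - lambda) * p_lm.
    return lmbda * p_knn + (1.0 - lmbda) * p_lm
```

Because the kNN term depends on which neighbors the index returns and on their distances, any change in the retrieval backend can shift the interpolated probabilities and hence the reported perplexity.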

> I set the parameter knn_gpu to False with a perplexity value of 12.5734, and when I set knn_gpu to True, the perplexity value becomes 17.3421. Is my issue a...

Thanks for your help and prompt response. Noted on this.

Hi, I was able to reproduce your perplexity score of 12.57 for the fine-tuned GPT-2 model from the HF page, using CPU settings for kNN-LM (with the default wrapper parameters).
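As a reminder of how a number like 12.57 is computed, perplexity is just the exponential of the mean per-token negative log-likelihood over the evaluation set. The NLL values below are made-up for illustration:

```python
import math

def perplexity(token_nlls):
    """Perplexity from per-token negative log-likelihoods (in nats)."""
    return math.exp(sum(token_nlls) / len(token_nlls))

# Illustrative only: a model assigning each token probability 1/4
# has per-token NLL of ln(4) and therefore perplexity 4.
example = perplexity([math.log(4)] * 3)
```

This is also why CPU and GPU runs can diverge: small per-token NLL differences (e.g. exact CPU search versus lower-precision GPU search in the retriever, if that is what the wrapper does) compound in the exponentiated average into visibly different perplexities.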