private-gpt
Warnings (gpt_tokenize: unknown token 'Ö') & (ggml_new_tensor_impl: not enough space in the context's memory pool (needed 7530291008, available 7525403600))
Hi, great work! I wanted to report these warning messages and errors when running on Windows:

gpt_tokenize: unknown token 'Γ'
gpt_tokenize: unknown token 'Ç'
gpt_tokenize: unknown token 'Ö'
gpt_tokenize: unknown token 'Γ'
gpt_tokenize: unknown token 'Ç'
gpt_tokenize: unknown token 'Ö'
gpt_tokenize: unknown token 'Γ'
gpt_tokenize: unknown token 'Ç'
gpt_tokenize: unknown token 'Ö'
ggml_new_tensor_impl: not enough space in the context's memory pool (needed 7530291008, available 7525403600)

This was while using the same default input document. Thanks, JJ
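For what it's worth, those "unknown token" characters look like a Windows console code-page artifact rather than a model problem: the UTF-8 bytes of a curly apostrophe (U+2019), decoded as the legacy cp437 console code page, come out as exactly the 'Γ' 'Ç' 'Ö' triplet reported above. A quick Python check (the cp437 assumption is mine, but the byte arithmetic is easy to verify):

    # The three bytes of a UTF-8 curly apostrophe, misread as cp437,
    # reproduce the "unknown token" characters from the log above.
    smart_quote = "\u2019"                    # RIGHT SINGLE QUOTATION MARK
    utf8_bytes = smart_quote.encode("utf-8")  # b'\xe2\x80\x99'
    mojibake = utf8_bytes.decode("cp437")     # assumed legacy console code page
    print(list(mojibake))                     # ['Γ', 'Ç', 'Ö']

The 'Γ' 'Ç' '£' and 'Γ' 'Ç' '¥' triplets reported further down decode the same way from curly double quotes (U+201C and U+201D), which fits a default document containing typographic punctuation.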
@ivicajerkovic I think it's because you don't have enough RAM. Try killing the processes you aren't using to free up RAM.
Thanks Terence.
Hi @terencebeauj, do you happen to know how much RAM it needs? I ran a test on a computer with 32 GB of RAM and it throws the same exception. It also throws the unknown-token warnings, even with the default document.
Hi @ezaca, no, I don't know. My computer also has 32 GB of memory and it works for me, even though I am using WSL2, which consumes quite a lot of memory. But I do have to close some other processes, like the web browser. I also get the warnings; those aren't important.
I get this same error after 2 or 3 queries. It appears to be an issue in llama.cpp: https://github.com/ggerganov/llama.cpp/issues/52.
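Worth noting the margin in these errors: needed 7530291008 vs. available 7525403600 is a shortfall of only about 5 MB, so this looks like llama.cpp's pre-sized scratch pool overflowing on a long prompt rather than the machine actually running out of RAM. A hedged workaround sketch, assuming the model is loaded through llama-cpp-python (privateGPT's actual loader may differ): cap the prompt length so evaluation stays inside the pre-allocated pool.

    from llama_cpp import Llama

    # Assumed model path; point this at your own GGML model file.
    llm = Llama(model_path="models/ggml-model-q4_0.bin", n_ctx=512)

    prompt = "Summarize the default document in two sentences."

    # Truncate defensively so the evaluated prompt cannot outgrow the pool.
    tokens = llm.tokenize(prompt.encode("utf-8"))
    if len(tokens) > 400:  # leave headroom for generated tokens
        prompt = llm.detokenize(tokens[:400]).decode("utf-8", errors="ignore")

    out = llm(prompt, max_tokens=100)
    print(out["choices"][0]["text"])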
Same problem here. I have 16 GB of RAM.
I have the same problem, running on a brand-new M2 with 16 GB of RAM.
gpt_tokenize: unknown token '?' (repeated many times)
ggml_new_tensor_impl: not enough space in the context's memory pool (needed 11906122816, available 11901034208)
zsh: segmentation fault  python3 privateGPT.py
exonarc@Exonarcs-MacBook-Pro privateGPT %
/Users/exonarc/.pyenv/versions/3.11.2/lib/python3.11/multiprocessing/resource_tracker.py:224: UserWarning: resource_tracker: There appear to be 1 leaked semaphore objects to clean up at shutdown
warnings.warn('resource_tracker: There appear to be %d '
I don't know how much RAM or VRAM it takes to run this program... I also have 32 GB.
Similar kind of problem:

gpt_tokenize: unknown token 'Γ'
gpt_tokenize: unknown token 'Ç'
gpt_tokenize: unknown token 'Ö'
gpt_tokenize: unknown token 'Γ'
gpt_tokenize: unknown token 'Ç'
gpt_tokenize: unknown token '£'
gpt_tokenize: unknown token 'Γ'
gpt_tokenize: unknown token 'Ç'
gpt_tokenize: unknown token '¥'
gpt_tokenize: unknown token 'Γ'
gpt_tokenize: unknown token 'Ç'
gpt_tokenize: unknown token 'Ö'
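Since several of these reports trace back to typographic punctuation in the ingested text, one low-tech workaround is to normalize the source document to plain ASCII before ingesting it. A minimal sketch; the file path is an assumed default document, so adjust it to your own file:

    from pathlib import Path

    doc = Path("source_documents/state_of_the_union.txt")  # assumed default document
    text = doc.read_text(encoding="utf-8", errors="replace")

    # Swap the curly punctuation the tokenizer chokes on for ASCII equivalents.
    replacements = {"\u2019": "'", "\u2018": "'", "\u201c": '"', "\u201d": '"', "\u2014": "-"}
    for smart, plain in replacements.items():
        text = text.replace(smart, plain)

    doc.write_text(text, encoding="utf-8")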