
Not enough space in the context's memory pool

Open · JosephShenV opened this issue 2 years ago · 1 comment


Describe the bug and how to reproduce it

When running on an M1 MacBook Pro (16 GB of memory), the program can only answer at most three questions before it crashes. Is there a way to release memory after each generated response?

Enter a question: what events will lead to the changes of Pension Plan?
gpt_tokenize: unknown token '�'
gpt_tokenize: unknown token '�'
gpt_tokenize: unknown token '�'
gpt_tokenize: unknown token '�'
gpt_tokenize: unknown token '�'
gpt_tokenize: unknown token '�'
ggml_new_tensor_impl: not enough space in the context's memory pool (needed 5283635968, available 5243946400)
zsh: segmentation fault
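For reference, the two byte counts in the error message put the failed allocation only about 38 MiB over the pool's remaining capacity, i.e. the pool is almost, but not quite, exhausted:

```python
# Byte counts taken directly from the ggml_new_tensor_impl error above.
needed = 5_283_635_968      # bytes requested from the context's memory pool
available = 5_243_946_400   # bytes still free in the pool
shortfall = needed - available
print(f"{shortfall:,} bytes short (about {shortfall / 2**20:.0f} MiB)")
# Output: 39,689,568 bytes short (about 38 MiB)
```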
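As a workaround, one option is to tear down and rebuild the model between questions so the backend's context memory pool is released before the next turn. The sketch below illustrates the general idea only; `load_model` and `ask` are hypothetical placeholders for however the script actually constructs its LLM and retrieval chain, not the privateGPT API.

```python
import gc

def load_model():
    # Hypothetical placeholder: build the LLM / retrieval chain the same
    # way the privateGPT script does. Replace with the real setup code.
    raise NotImplementedError

def ask(model, question):
    # Hypothetical placeholder: run one query through the chain.
    raise NotImplementedError

while True:
    query = input("Enter a question: ")
    if query.strip().lower() in {"exit", "quit"}:
        break
    model = load_model()   # fresh model -> fresh ggml context memory pool
    print(ask(model, query))
    del model              # drop the only reference so the backend can
    gc.collect()           # free its buffers before the next question
```

Reloading the model adds startup cost to every question, but it may keep the pool from filling up across turns until the underlying issue is fixed upstream.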

Expected behavior

The program should be able to keep generating answers for additional questions.

Environment:

  • OS / hardware: macOS / Apple M1 MacBook Pro, 16 GB RAM


JosephShenV · May 19 '23 17:05

This is a duplicate of https://github.com/imartinez/privateGPT/issues/104, https://github.com/imartinez/privateGPT/issues/170, and https://github.com/imartinez/privateGPT/issues/181.

PulpCattel · May 19 '23 17:05