private-gpt

I got this error after ingest: Segmentation fault

Open rexzhang2023 opened this issue 2 years ago • 7 comments

Using embedded DuckDB with persistence: data will be stored in: db
gptj_model_load: loading model from 'models/ggml-gpt4all-j-v1.3-groovy.bin' - please wait ...
gptj_model_load: n_vocab = 50400
gptj_model_load: n_ctx = 2048
gptj_model_load: n_embd = 4096
gptj_model_load: n_head = 16
gptj_model_load: n_layer = 28
gptj_model_load: n_rot = 64
gptj_model_load: f16 = 2
gptj_model_load: ggml ctx size = 4505.45 MB
gptj_model_load: memory_size = 896.00 MB, n_mem = 57344
gptj_model_load: ................................... done
gptj_model_load: model size = 3609.38 MB / num tensors = 285

Enter a query: what is headline on wsj today
gpt_tokenize: unknown token '�'
gpt_tokenize: unknown token '�'
gpt_tokenize: unknown token '�'
gpt_tokenize: unknown token '�'
gpt_tokenize: unknown token '�'
gpt_tokenize: unknown token '�'
gpt_tokenize: unknown token '�'
gpt_tokenize: unknown token '�'
gpt_tokenize: unknown token '�'
gpt_tokenize: unknown token '�'
gpt_tokenize: unknown token '�'
gpt_tokenize: unknown token '�'
gpt_tokenize: unknown token '�'
gpt_tokenize: unknown token '�'
gpt_tokenize: unknown token '�'
gpt_tokenize: unknown token '�'
gpt_tokenize: unknown token '�'
gpt_tokenize: unknown token '�'
ggml_new_tensor_impl: not enough space in the context's memory pool (needed 16137658480, available 16089381008)
Segmentation fault

rexzhang2023 avatar May 22 '23 04:05 rexzhang2023

I get the same error if I run more than two queries.

jamsnrihk avatar May 22 '23 05:05 jamsnrihk

You don't have enough RAM. A solution would be setting use_mmap to False
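For reference, a minimal sketch of what that change could look like, using the underlying llama-cpp-python library directly (the ggml-gpt4all-j model in the log above is loaded through the GPT-J backend, which may not expose this flag; the model path below is only a placeholder):

```python
# Minimal sketch, not privateGPT's actual code: disable mmap so the whole
# model is read into RAM buffers instead of being memory-mapped from disk.
from llama_cpp import Llama  # llama-cpp-python

llm = Llama(
    model_path="models/your-ggml-model.bin",  # placeholder path, adjust to your model
    n_ctx=2048,
    use_mmap=False,  # do not memory-map the model file
)
```

In privateGPT the flag would be passed at the point where the LLM object is constructed.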

maozdemir avatar May 22 '23 06:05 maozdemir

Same problem. I have 128 GB of RAM and I'm testing the project on 20 text documents.

prestaino avatar May 22 '23 09:05 prestaino

You don't have enough RAM. A solution would be setting use_mmap to False

...since I have 1/4 TB of RAM and had that error after just ingesting the example, what is "enough" RAM? Any idea?

yalla avatar May 22 '23 09:05 yalla

Same issue, MacBook Pro M1 Pro; 16 GB RAM

jannikstdl avatar May 22 '23 21:05 jannikstdl

You don't have enough RAM. A solution would be setting use_mmap to False

...since I have 1/4 TB of RAM and had that error after just ingesting the example, what is "enough" RAM? Any idea?

Yikes... I have no idea then...

maozdemir avatar May 23 '23 00:05 maozdemir

Check out this discussion. With it I have been able to run most of the models that would not run before, and they are not crashing:

https://huggingface.co/TheBloke/MPT-7B-Instruct-GGML/discussions/2

llama-cpp-python==0.1.53
ctransformers==0.2.0
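A one-line way to pin those exact versions, assuming a pip-managed environment:

```sh
pip install llama-cpp-python==0.1.53 ctransformers==0.2.0
```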

gaurav-cointab avatar May 23 '23 05:05 gaurav-cointab