
[Mem ERROR] Segmentation Fault

Open jpsanchezl10 opened this issue 1 year ago • 12 comments

ggml_new_tensor_impl: not enough space in the context's memory pool (needed 15950137152, available 15919123008)
zsh: segmentation fault  python privateGPT.py
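For what it's worth, the shortfall in that message is small relative to the pool size; shell arithmetic on the two numbers from the error makes that concrete:

```shell
# Difference between "needed" and "available" from the error above, in bytes
echo $((15950137152 - 15919123008))   # 31014144
# Same shortfall in MiB (integer division), roughly 30 MiB
echo $((31014144 / 1024 / 1024))      # 29
```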

jpsanchezl10 avatar May 16 '23 18:05 jpsanchezl10

Getting the error as well:

ggml_new_tensor_impl: not enough space in the context's memory pool (needed 8230752336, available 8223461408)

215carlos avatar May 16 '23 19:05 215carlos

Same here.

ggml_new_tensor_impl: not enough space in the context's memory pool (needed 7177413200, available 7099758608)

I'm on a Mac M1

MikeCraig418 avatar May 16 '23 19:05 MikeCraig418

M2 Pro for me.

jpsanchezl10 avatar May 17 '23 00:05 jpsanchezl10

Here also on Ubuntu 22.04.2 LTS.

ghdot avatar May 17 '23 07:05 ghdot

I was able to solve this for some of my queries by adding a swap file in Linux. I made an 8 GB (8192 MB) swap file, but I am thinking a 16 GB swap file may be better.

https://askubuntu.com/questions/126018/adding-a-new-swap-file-how-to-edit-fstab-to-enable-swap-after-reboot

These are the steps to create a swap on a file:

Create a large file e.g. with

sudo mkdir -p /var/cache/swap/   # create a directory that holds the swap file
sudo dd if=/dev/zero of=/var/cache/swap/myswap bs=1M count=4096   # for 4 GByte

Of course any other method of creating a file of defined size would do.

Announce swap to the system

sudo chmod 0600 /var/cache/swap/myswap   # only root should have access
sudo mkswap /var/cache/swap/myswap       # format as swap
sudo swapon /var/cache/swap/myswap       # announce to system

Insert the following line in /etc/fstab for swap from the next boot:

/var/cache/swap/myswap none swap sw 0 0

Note: In case your system files are on an SSD, you may want to consider holding your swap file on a hard-disk location instead.
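As a sanity check on the dd count parameter above: with bs=1M, count is simply the desired file size in MiB (the 16 GiB value below corresponds to the larger swap size suggested, not something tested in the thread):

```shell
# With bs=1M, dd's count is the swap file size in MiB
echo $((8 * 1024))    # 8192  -> count for an 8 GiB file
echo $((16 * 1024))   # 16384 -> count for a 16 GiB file
```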

bearney74 avatar May 17 '23 14:05 bearney74

@bearney74

some of my queries

What do you mean by "some"? Can you give an example of when it would crash based on the query?

MikeCraig418 avatar May 17 '23 18:05 MikeCraig418

> (bearney74's swap-file instructions, quoted in full above)

Can you install mkswap on macOS?

jpsanchezl10 avatar May 17 '23 18:05 jpsanchezl10

Same error here:

ggml_new_tensor_impl: not enough space in the context's memory pool (needed 6996085024, available 6929500608)
Segmentation fault (core dumped)

System:
model name : 13th Gen Intel(R) Core(TM) i5-13600K
MemTotal: 32623700 kB
PRETTY_NAME="Ubuntu 22.04.2 LTS"

PaulWeiss avatar May 17 '23 21:05 PaulWeiss

Same issue on a cloud instance:

ggml_new_tensor_impl: not enough space in the context's memory pool (needed 6996085024, available 6929500608)
Segmentation fault (core dumped)

System:
model name : AMD EPYC 7R32
MemTotal: 195784944 kB
PRETTY_NAME="Ubuntu 22.04.2 LTS"
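For scale, that MemTotal converts to roughly 186 GiB, far more than the ~7 GB pool in the error, which suggests the limit is the size ggml reserves for its context pool rather than host memory pressure (an inference, not something confirmed in this thread):

```shell
# MemTotal in /proc/meminfo is reported in kB; convert to GiB
echo $((195784944 / 1024 / 1024))   # 186
```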

PaulWeiss avatar May 17 '23 21:05 PaulWeiss

Same issue on an Intel Mac and an M1 Mac; adjusting the embedding size from 1 KB to 25 MB gives the same result.

mruckman1 avatar May 18 '23 03:05 mruckman1

Try tweaking chunk_size. Bumping it from 500 to 1000 seems to work on my M1 with 32 GB.
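If you want to try this, note that the file setting chunk_size has moved between privateGPT versions, so locating it with grep is safer than assuming a fixed path (the search below is a generic sketch, not a path from this thread):

```shell
# Run from the root of your privateGPT checkout to find where
# the ingestion chunk size is configured
grep -rn "chunk_size" --include="*.py" .
```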

sebastienpires avatar May 18 '23 11:05 sebastienpires

I get this error roughly every other query within a run. Mac M2, 16 GB.

Edit: WizardLM doesn't seem to put as much pressure on memory initially, so is the gpt4all model part of the issue?

AntonioCiolino avatar May 18 '23 17:05 AntonioCiolino