LocalAI
Crash when asking a question that is too big
I tried to use LocalAI as a backend for https://github.com/madox2/vim-ai , but when I feed it a request larger than the README examples, it hangs for a long time before crashing with:
```
ggml_gptj_new_tensor_impl: not enough space in the context's memory pool (needed 1073907088, available 1073741824)
fatal error: unexpected signal during runtime execution
[signal SIGSEGV: segmentation violation code=0x2 addr=0x48 pc=0x1031386a0]
```
See the attached redacted.txt stack trace.
Hi, @MartyLake.
It means that the context's memory pool is full and cannot allocate a new tensor of 1,073,907,088 bytes. The pool holds temporary data, such as the tensors used while computing a graph; once it is exhausted, the graph fails to execute.
We need to investigate this more deeply.
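To illustrate the failure mode (this is a minimal sketch, not LocalAI's or ggml's actual code): ggml carves tensors out of one fixed-size pool with a bump allocator, so a request that needs more bytes than remain fails with exactly the "needed X, available Y" shape seen above.

```go
package main

import "fmt"

// pool is a toy stand-in for ggml's fixed-size context memory pool.
type pool struct {
	size int // total bytes reserved up front
	used int // bytes already handed out
}

// alloc reserves n bytes, or fails when the pool cannot satisfy the request,
// mirroring the "not enough space in the context's memory pool" error.
func (p *pool) alloc(n int) error {
	if p.used+n > p.size {
		return fmt.Errorf("not enough space in the context's memory pool (needed %d, available %d)",
			p.used+n, p.size)
	}
	p.used += n
	return nil
}

func main() {
	p := &pool{size: 1073741824} // 1 GiB, as in the report
	// A tensor slightly larger than the whole pool cannot be placed:
	if err := p.alloc(1073907088); err != nil {
		fmt.Println(err)
	}
}
```

In ggml the pool size is chosen when the model context is created, so a prompt that produces larger intermediate tensors than the backend budgeted for cannot be satisfied at runtime.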
:warning::warning::warning::warning::warning:
Hi! I'm a bot running with LocalAI ( a crazy experiment of @mudler ) - please beware that I might hallucinate sometimes!
but... I can also be funny or helpful :smile_cat: and, based on what you wrote in the issue, I can generally give good tips or point to relevant places in the documentation or the code.
Don't engage in conversation with me, I don't support (yet) replying!
:warning::warning::warning::warning::warning:
It seems the issue is a memory-allocation failure: the gptj backend pre-allocates a fixed-size memory pool for its computation context, and the large input needed more space (1,073,907,088 bytes) than was reserved (1,073,741,824 bytes, i.e. 1 GiB). Note that the GOMAXPROCS environment variable only controls how many CPU cores the Go runtime may use in parallel; it does not increase the memory available to the process, so raising it will not fix this error. As a workaround, reduce the size of the input request so it fits within the pool, or configure a larger context for the model. If the issue persists, please provide more information about the specific model and input data being used so we can investigate further.
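One knob to try is the model's YAML config, which LocalAI reads for per-model settings such as `context_size`. A minimal sketch (file name and values are illustrative, not a tested fix):

```yaml
# models/gpt4all-j.yaml -- illustrative, assumes the gpt4all-j model file below
name: gpt4all-j
parameters:
  model: ggml-gpt4all-j.bin
# Keep the prompt context within what the backend's memory pool can hold.
context_size: 512
```

Capping the context keeps prompts from producing intermediate tensors larger than the pre-allocated pool, at the cost of truncating long inputs.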
Sources:
- https://localai.io/basics/getting_started/index.html
- https://github.com/go-skynet/LocalAI/tree/master/.github/ISSUE_TEMPLATE/bug_report.md
- https://github.com/go-skynet/LocalAI/tree/master/api/openai/completion.go