model requires more system memory than is available when useMmap
What is the issue?
When I use the Continue VS Code extension to call Ollama with a config like:

```json
{
  "model": "qwen2.5-coder:14b",
  "title": "qwen2.5-coder:14b",
  "provider": "ollama",
  "completionOptions": {
    "keepAlive": 9999999,
    "useMmap": true
  }
}
```
it still checks system memory, disregarding the `"useMmap": true` option, and returns a 500 internal error:

```json
{"error":"model requires more system memory (17.7 GiB) than is available (13.6 GiB)"}
```
OS
Windows
GPU
No response
CPU
Other
Ollama version
0.4.7
mmap doesn't affect the check for memory. If your system doesn't have enough system memory to load the model, you need to increase it by adding swap.
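For context on what is being discussed: memory-mapping lets the OS page file data in lazily on access instead of copying the whole file into RAM up front, which is why the poster argues the up-front free-memory check is too strict when mmap is requested. A minimal sketch of the general mmap behavior (not Ollama's actual loading code; the stand-in file is hypothetical):

```python
import mmap
import os
import tempfile

# Create a stand-in "model file" (hypothetical; any large file works).
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"\x00" * (1 << 20))  # 1 MiB of zeros
    path = f.name

with open(path, "rb") as f:
    # mmap maps the file into the process address space; pages are
    # faulted in by the OS on access, so resident memory grows only
    # as bytes are actually touched -- unlike f.read(), which copies
    # the entire file into RAM immediately.
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    first = mm[0]      # touching one byte pages in only that region
    size = mm.size()   # full mapped size, independent of resident use
    mm.close()

os.unlink(path)
print(first, size)  # 0 1048576
```

Whether skipping the memory check is safe still depends on access patterns: if inference touches most of the weights every forward pass, the pages end up resident anyway and the OS must evict or swap.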
No, the logic is wrong. Both Windows and macOS grow swap dynamically and automatically, as I mentioned at https://github.com/ollama/ollama/issues/6918#issuecomment-2488221380 . So when a user knows what mmap is and explicitly asks for it in the config, you should skip the check.
Has the latest Windows version update changed the useMmap behavior? It doesn't seem to mmap, but copies the model into memory instead.
EDIT: Only the first run had this issue. Not happening now.
Also, the Ollama app doesn't support useMmap.