Cannot run ollama on my server using the docker image, error 132
Hello,
This is the first time I have faced such an issue: I cannot run the container at all, it crashes as soon as it is deployed. I don't know which information would be useful to debug this. My host is a Debian 12 server with Docker 25 CE.
I was first deploying with a compose file, but I switched back to the plain docker command line to double-check:
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
It creates the volume, but the container crashes with exit code 132:
State
  Dead:        false
  Error:
  ExitCode:    132
  FinishedAt:  2024-01-21T10:24:09.726297577Z
  OOMKilled:   false
  Paused:      false
  Pid:         0
  Restarting:  false
  Running:     false
  StartedAt:   2024-01-21T10:24:09.724212624Z
  Status:      exited
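For reference, the compose service I had originally been deploying was essentially the following (a reconstructed sketch equivalent to the docker run command above; the service name and file layout are assumptions):

```yaml
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama:/root/.ollama

volumes:
  ollama:
```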
From there I have no clue how to identify what is going on. I was not able to find any reference to error 132 in the source code that could help me run further checks. Maybe you will have some ideas! Thanks!
I have the same exact issue
I guess it has something to do with support for AVX instructions. I am using an Intel Gold 6400, which is socket 1200, Comet Lake generation, but it only supports SSE 4.1 and 4.2, unlike the i5 I also have (same socket and generation), which does support AVX. If someone can confirm ... thanks!
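For what it's worth, exit code 132 is consistent with this hypothesis: container exit statuses above 128 follow the 128 + signal-number convention, and 132 − 128 = 4, which is SIGILL ("illegal instruction") — the signal a process receives when it executes an instruction (such as an AVX op) the CPU doesn't support. A quick sketch to check on the host:

```shell
# Exit status 132 = 128 + 4; signal 4 is SIGILL (illegal instruction).
kill -l 4                                # prints: ILL

# See whether the host CPU advertises AVX in its feature flags;
# no output here means the CPU (or the VM) does not expose AVX.
grep -m1 -o 'avx[0-9_]*' /proc/cpuinfo
```

If the grep prints nothing, the binary in the image was most likely built with AVX instructions that this CPU cannot execute.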
@GuiPoM can you try running without daemon mode (drop the -d flag) to see if there is any output before the exit/crash? Also make sure to pull the image (docker pull ollama/ollama) so that you get the latest version.
Thank you for your answer. I do not know if you made the link with the other conversation we had in issue #1279 about support for CPUs without AVX, but the rc image you shared with me works fine. I got it working on this platform (CPU without AVX, no GPU), on another one (CPU with AVX, no GPU), and on a final one (CPU with AVX and an nVidia GPU); all three start fine.
So I guess the "latest" ollama image is now outdated and does not include the latest changes needed to deploy it.
I can do the check without -d if you think it is useful, but since the rc image works, I guess we can consider my issue closed, right?
Great to hear the latest release is working for you!
So I guess the "latest" ollama image is now outdated and does not include the latest changes needed to deploy it.
We do update the latest tag on every release, but depending on your container runtime and how you run the container, "latest" can grow stale on your system. Running docker pull ollama/ollama will ensure you're picking up the actual latest image from Docker Hub.
It sounds like we can close this now.