GuiPoM
> With [release 0.1.21](https://github.com/jmorganca/ollama/releases/tag/v0.1.21) we now support multiple CPU optimized variants of the LLM library. The system will auto-detect the capabilities of the CPU and select one of AVX2, AVX,...
> We haven't pushed an official updated image yet, since [0.1.21](https://github.com/ollama/ollama/releases/tag/v0.1.21) is still a pre-release while we squash a few final bugs.
>
> If you're eager to try it...
> @GuiPoM we've recently added ROCm support to the container image, which required switching the base layer to include the ROCm libraries, which unfortunately are quite large. We'd prefer to...
I guess it has something to do with AVX instruction support. I am using an Intel Gold 6400, which is socket LGA 1200, Comet Lake generation, but only supports...
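A quick way to confirm that guess on Linux is to check the CPU flags the kernel reports. This is a minimal sketch; the flag names (`avx`, `avx2`) are the standard `/proc/cpuinfo` ones, and per the release note above the library picks its variant from the same capabilities:

```shell
# Report which AVX level this CPU advertises. The optimized LLM
# library variants mentioned in 0.1.21 are selected along these lines
# (avx2 > avx > plain CPU build).
if grep -qw avx2 /proc/cpuinfo; then
  echo "AVX2 supported"
elif grep -qw avx /proc/cpuinfo; then
  echo "AVX supported"
else
  echo "no AVX - the plain CPU build would be needed"
fi
```

A Pentium Gold class chip would land in the last branch, which fits the symptom if the binary in the image assumes AVX.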
> @GuiPoM can you try running without daemon mode (drop the `-d` flag) to see if there is any output before the exit/crash?
>
> Also make sure to pull...
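For a container install, that suggestion amounts to something like the following (a sketch; `ollama/ollama:latest` and port 11434 are the usual defaults, adjust the tag and volume for your setup):

```shell
# Pull the latest image first so a stale build isn't the culprit,
# then run in the foreground (no -d) so any crash output is printed
# directly to the terminal.
docker pull ollama/ollama:latest
docker run --rm -p 11434:11434 -v ollama:/root/.ollama ollama/ollama:latest
```

If the container exits immediately, the last lines printed before the exit are what's worth pasting into the issue.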