Results: 12 comments of Sid

I've seen this behavior in #2411, but only with the version from ollama.com. Try it with the latest released binary? https://github.com/ollama/ollama/releases/tag/v0.1.27

> Ollama skipped the iGPU, because it has less than 1GB of VRAM. You have to configure VRAM allocation for the iGPU in BIOS to something like 8GB. Detecting and...

> Do i need another amdgpu module on the host than the one from the kernel (6.7.6)? Maybe, https://github.com/ROCm/ROCm/issues/816 seems relevant. I'm just using AMD-provided DKMS modules from https://repo.radeon.com/amdgpu/6.0.2/ubuntu to...

Erm, the end of my comment was a question, not a statement. I personally feel that it would be disrespectful towards the esteemed experts and maintainers to swamp them with...

I also have a Radeon RX 7900 XTX, and I've compiled ollama with `export AMDGPU_TARGETS=gfx1100` and `CLBlast_DIR` set, all according to development.md, but ollama fails to detect the GPU with a...
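For reference, a sketch of that kind of out-of-tree build, loosely following ollama's development.md of that era (`go generate` then `go build`); the CLBlast cmake path below is an assumption, point it at wherever `CLBlastConfig.cmake` lives on your system.

```shell
# Sketch of a source build targeting a gfx1100 (RDNA3) card such as the
# RX 7900 XTX. Run from an ollama checkout; paths are examples.
export AMDGPU_TARGETS=gfx1100              # restrict ROCm codegen to RDNA3
export CLBlast_DIR=/usr/lib/cmake/CLBlast  # assumption: CLBlast installed here
go generate ./...   # regenerates the bundled llama.cpp build with GPU support
go build .          # produces the ollama binary in the repo root
```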

I've compiled main at commit 1e23e82 with some added print statements; the GPU was detected but still not used, and the logs say `libnuma.so.1` is missing, yet APT says "libnuma-dev is...
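One quick way to see whether the dynamic loader can actually resolve that library: on Debian/Ubuntu the runtime library ships in the `libnuma1` package, while `libnuma-dev` mainly adds the headers and the unversioned `.so` symlink for linking.

```shell
# Check whether the runtime loader knows about libnuma.so.1; having
# libnuma-dev installed does not by itself prove the loader can find it.
if ldconfig -p | grep -Fq libnuma.so.1; then
  echo "libnuma.so.1 found by the loader"
else
  echo "libnuma.so.1 missing - try installing the libnuma1 package"
fi
```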

Thank you, @remy415 - I've been able to solve all issues. @haplo - I could help you build and test, since we have the same GPU but different CPUs. - rebuilding...

@DocMAX As I stated, rebuilding amdgpu-dkms (after upgrading ROCm to 6.0.2) allowed me to use the driver version with the `/sys/module/amdgpu/version` interface present. It doesn't give you anything but the...
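As a quick check, that interface only exists with the out-of-tree driver, so reading it tells you which amdgpu build is loaded:

```shell
# The version file is only created by the out-of-tree (DKMS) amdgpu driver;
# the in-kernel driver does not expose it.
cat /sys/module/amdgpu/version 2>/dev/null \
  || echo "no version file - in-kernel amdgpu (or module not loaded)"
```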

Does the loop respect `systemd`'s RestartSec=3 setting? You could diagnose by changing the `ollama.service` file and setting `ExecStart=ollama serve` to run a wrapper script instead, for example to hold the...
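A wrapper along these lines would do (the path `/usr/local/bin/ollama-logged` and the log location are hypothetical): it appends a timestamp and then `exec`s the wrapped command, so systemd still supervises the real server process, and consecutive log lines show whether launches really are ~3 s apart.

```shell
#!/bin/sh
# Hypothetical wrapper, e.g. /usr/local/bin/ollama-logged: log a launch
# timestamp, then replace this shell with the wrapped command so systemd
# tracks the actual server process rather than the wrapper.
echo "$(date -Is) launching: $*" >> /tmp/ollama-restarts.log
exec "$@"
```

You would then set `ExecStart=/usr/local/bin/ollama-logged ollama serve` in the unit (e.g. via `systemctl edit ollama`) and compare timestamps in the log.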

I've had this exact issue frequently with ROCm 5.7 on a Radeon RX 7900 XTX. Upgrading to ROCm 6.0 solved it for me. At least Stable Diffusion with torch-2.3.0+rocm5.7 still...