SONG Ge
Hi @jaymeanchante, we have reproduced your issue and are working on resolving it; we will inform you when we make progress.
Hi @jaymeanchante, I can now run ollama **successfully** on Windows with Intel Iris Xe (GPU driver 5534). The reason I was able to reproduce your issue is that the GPU...
> That's the output of mine, still runs on CPU no matter what I do

Hi @opticblu, 1. Could you please provide the detailed logs returned by the ollama server?...
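If it helps when gathering those logs, a minimal sketch for capturing the server-side output to a file is below; the `OLLAMA_DEBUG` variable and the `./ollama` binary name are assumptions based on the standard ollama CLI and may differ on your setup.

```bash
# Sketch: capture the full ollama server log to a file so it can be attached to the issue.
# OLLAMA_DEBUG and the ./ollama path are assumptions; adjust to your environment.
OLLAMA_DEBUG=1 ./ollama serve > ollama-server.log 2>&1
```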
Hi @junruizh2021 , I have tested `glm-4-9b-chat.Q4_K_S` on an MTL device and it works well. Could you please provide more information from the ollama server side and details about your device?...
Hi @junruizh2021 , we have reproduced your issue on an Ubuntu Arc770 device; we are looking for a solution and will reply to you soon. You may use the `2.2.0b20240910` version for...
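As a rough sketch, pinning that earlier build as a workaround might look like the following; the `[cpp]` extra is taken from the install command in the later comment, and the exact pin syntax is an assumption to verify against the ipex-llm docs.

```bash
# Sketch: pin the 2.2.0b20240910 pre-release of ipex-llm with the cpp extra (assumed syntax)
pip install --pre "ipex-llm[cpp]==2.2.0b20240910"
```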
Hi @junruizh2021 , you may try our latest version of ipex-llm ollama to run glm4-9b with `pip install --pre --upgrade ipex-llm[cpp]`. We have fixed the output issue when doing multi-turn...
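For reference, a minimal upgrade sequence might look like the sketch below; the `init-ollama` re-link step is an assumption based on the usual ipex-llm ollama setup and may not be needed on every platform.

```bash
# Sketch: upgrade to the latest ipex-llm ollama build
pip install --pre --upgrade ipex-llm[cpp]

# Assumed step: re-create the ollama symlinks after upgrading
# (ipex-llm[cpp] typically ships an init-ollama script for this on Linux)
# init-ollama
```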
Hi @reeingal, ollama with IPEX-LLM does not support running on a pure CPU platform, as we haven't optimized ollama for CPU. You may switch to a GPU device to enable...
Hi @doucej , could you please provide more information from the ollama server side (like the ollama server log)? This would be helpful for us in addressing the issue and...
Hi @doucej, this issue is due to oneAPI not being installed correctly. You may run ollama with the following steps: 1. Please run `sycl-ls` to check your sycl devices. The expected...
Hi @doucej , Ollama fails to run because you **haven't correctly installed oneAPI**. If you have installed oneAPI correctly, `[ext_oneapi_level_zero:gpu]` should be present as expected in the...
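A minimal sketch of that check, assuming a default oneAPI install under `/opt/intel/oneapi` on Linux (adjust the path for your system):

```bash
# Sketch: verify oneAPI is set up and the GPU is visible to SYCL
# /opt/intel/oneapi is the default Linux install location; adjust if installed elsewhere
source /opt/intel/oneapi/setvars.sh

# A correctly installed setup should list an entry such as:
#   [ext_oneapi_level_zero:gpu] ... <your Intel GPU> ...
sycl-ls
```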