
Hope you will continue updating Ollama support.

Open yizhangliu opened this issue 3 months ago • 5 comments

https://github.com/ipex-llm/ipex-llm/releases/tag/v2.3.0-nightly is the most recent release. I hope the Ollama builds will continue to be updated.

yizhangliu avatar Sep 09 '25 09:09 yizhangliu

Hope so.

kflnig avatar Sep 10 '25 11:09 kflnig

I see that Ollama is now working on a Vulkan backend (ollama/ollama#11835). Also, the latest Intel AI-Playground update switched from SYCL to Vulkan: https://github.com/intel/AI-Playground/releases.

I’m curious about the performance differences between these and IPEX-LLM. Initially the Vulkan backend was quite slow, but I think the recent updates have improved it.
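
For anyone who wants a rough number to compare with, here is a minimal sketch (assuming an Ollama server on the default port 11434 and the Python `requests` package; the model name and prompt are placeholders) that reads the eval_count / eval_duration fields from Ollama's /api/generate response to estimate generation speed. The same script can be pointed at the ipex-llm build and at upstream Ollama with the Vulkan backend to compare them.

```python
import requests  # assumes the `requests` package is installed

# Point this at whichever Ollama build you are testing
# (the ipex-llm portable build or upstream Ollama with the Vulkan backend).
OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "llama3.1:8b"  # placeholder model name, substitute your own
PROMPT = "Explain the difference between SYCL and Vulkan in one paragraph."

resp = requests.post(
    OLLAMA_URL,
    json={"model": MODEL, "prompt": PROMPT, "stream": False},
    timeout=600,
)
resp.raise_for_status()
data = resp.json()

# Ollama reports eval_count (generated tokens) and eval_duration (nanoseconds).
tokens = data["eval_count"]
seconds = data["eval_duration"] / 1e9
print(f"{tokens} tokens in {seconds:.1f}s -> {tokens / seconds:.1f} tok/s")
```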

danielmayost avatar Sep 11 '25 07:09 danielmayost

I’m curious about the performance differences between these and IPEX-LLM

Here are some Vulkan and SYCL test results from llama.cpp on an A770 on Windows:

[Image: llama.cpp Vulkan vs. SYCL benchmark results on an A770]

https://github.com/ggml-org/llama.cpp/discussions/10879#discussioncomment-14467566

And I do not know how to set up llama.cpp and Ollama correctly to get performance as good as ipex-llm's: https://github.com/intel/ipex-llm/issues/13309

savvadesogle avatar Sep 21 '25 12:09 savvadesogle

https://github.com/ollama/ollama/pull/11835#issuecomment-3283241149

danielmayost avatar Sep 21 '25 13:09 danielmayost

https://github.com/intel/ipex-llm/issues/13308#issuecomment-3319233504

Ellie-Williams-007 avatar Sep 22 '25 14:09 Ellie-Williams-007