ramalama
Any plans to incorporate ipex-llm
I see the support for OpenVINO; are there any plans to support https://github.com/intel/ipex-llm? It has been reported to me that ollama+ipex-llm is very fast on Intel Ultra iGPUs.
PRs welcome @mecattaf
Why did you close the issue? I'm struggling with the current Intel GPU integration, and ipex-llm might be the solution. See my comment here.
Maybe have a look at https://github.com/eleiton/ollama-intel-arc and how that project incorporates ipex-llm. This Docker image works for me, but I would prefer to use ramalama if ipex-llm support is possible.
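For context, running an ipex-llm-enabled Ollama container on an Intel iGPU generally comes down to passing the `/dev/dri` render node into the container. A minimal compose sketch of that pattern (the image name is a placeholder, not the actual ollama-intel-arc image; check that project's README for the real image and environment variables):

```yaml
# Hypothetical sketch only: image name is a placeholder; the real one
# is in the ollama-intel-arc README. This just shows the usual Intel
# GPU passthrough pattern for containers.
services:
  ollama:
    image: <ipex-llm-ollama-image>   # placeholder, see project README
    devices:
      - /dev/dri:/dev/dri            # expose the Intel GPU render node
    volumes:
      - ollama:/root/.ollama         # persist pulled models
    ports:
      - "11434:11434"                # Ollama's default API port
volumes:
  ollama:
```

The same `/dev/dri` passthrough would presumably be what a ramalama ipex-llm backend needs as well.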