Portable Builds Upstream Version Update Request
The portable build of Ollama with IPEX-LLM built in, including the latest weekly build, is based on an upstream Ollama release that is too old to support the latest LLMs from Google, Meta, and others. Please consider rebasing the portable builds on a newer upstream Ollama version.
C:\Users\a_user\Portable\ollama-ipex-llm-2.3.0b20250429-win>ollama pull gemma3:1b
pulling manifest
pulling 7cd4618c1faf... 100% ▕████████████████████████████████████████████████████████▏ 815 MB
pulling e0a42594d802... 100% ▕████████████████████████████████████████████████████████▏ 358 B
pulling dd084c7d92a3... 100% ▕████████████████████████████████████████████████████████▏ 8.4 KB
pulling 3116c5225075... 100% ▕████████████████████████████████████████████████████████▏ 77 B
pulling 120007c81bf8... 100% ▕████████████████████████████████████████████████████████▏ 492 B
verifying sha256 digest
writing manifest
success
C:\Users\a_user\Portable\ollama-ipex-llm-2.3.0b20250429-win>ollama run gemma3:1b
Error: llama runner process has terminated: this model is not supported by your version of Ollama. You may need to upgrade
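For anyone triaging this, a quick way to confirm the mismatch is to check which upstream Ollama version the portable build bundles. Upstream added Gemma 3 support in Ollama v0.6.0, so an older version reported here would explain the error above. A minimal check, run from the extracted portable directory:

:: Print the bundled upstream Ollama version; Gemma 3 requires v0.6.0 or newer.
ollama -v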
Hi @bmh129, we are working on adding support for the gemma3 model, and it will be released next week. Please stay tuned :)
@sgwhat Is it possible to collaborate with the Ollama team to have Intel GPU devices supported automatically, like they do for AMD ROCm and NVIDIA CUDA? They are now at Ollama v0.8.0.
That's a good idea, and we actually plan to do so.
Hi all,
We have released the Ollama portable zip with Gemma 3 vision support. Please download ollama-ipex-llm-2.3.0b20250612-win.zip (or the corresponding -ubuntu.zip) from the following link: Ollama Portable Zip Release.
To get started, follow the portable zip guide available here: Ollama Portable Zip Quickstart Guide.
Give it a try and let us know your feedback!
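For anyone trying it, here is a minimal sketch of the Windows flow, assuming the start-ollama.bat launcher and the directory name described in the quickstart guide (adjust if your extraction path or the guide's steps differ):

:: 1. Start the IPEX-LLM Ollama service from the extracted portable directory.
cd ollama-ipex-llm-2.3.0b20250612-win
start-ollama.bat

:: 2. In a second terminal in the same directory, pull and run a Gemma 3
::    model; the 4b and larger variants include the vision encoder, while
::    gemma3:1b is text-only.
ollama run gemma3:4b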
> That's a good idea, and we actually plan to do so.

Any idea when this will happen?