jianjungu
Hi, I'm running ipex-llm with the "ipex-2.1.10+xpu" wheel and Python 3.10 on an Ultra 155H laptop with Win11. **Issue 1.** Every time I set up ipex-llm on a laptop that has been...
When I run `ollama --version`, it always returns 0.0.0.0. Please add a version or tag number to ollama.exe so we can do better version management in our project.
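For context on the request above: Go binaries (ollama is written in Go) commonly embed their version by declaring a package-level variable and overriding it at build time with `-ldflags "-X ..."`. The snippet below is a minimal sketch of that pattern, not ollama's actual code; the variable name and the `0.0.0.0` placeholder are assumptions chosen to mirror the behavior reported here.

```go
package main

import "fmt"

// Version is intended to be overridden at build time, e.g.:
//   go build -ldflags "-X main.Version=2.1.0b20240820" -o ollama.exe .
// If the build step omits the flag, the placeholder below is what
// `--version` would print, matching the 0.0.0.0 seen above.
var Version = "0.0.0.0"

func main() {
	fmt.Println("ollama version is", Version)
}
```

If the release build simply doesn't pass the `-ldflags` override, that would explain why the shipped ollama.exe reports 0.0.0.0.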
The official ollama supports this model as of v0.3.4 ([https://github.com/ollama/ollama/releases/tag/v0.3.4](https://github.com/ollama/ollama/releases/tag/v0.3.4)). I tried ollama in 2.1.0b20240820, but it failed with 0xc0000005:

```
time=2024-08-21T13:10:17.961+08:00 level=INFO source=memory.go:133 msg="offload to gpu" layers.requested=-1 layers.real=25 memory.available="40.5 GiB" memory.required.full="1.1...
```