
Old version of Ollama

Open Teejer opened this issue 7 months ago • 9 comments

When I try to pull Qwen3 with this, the portable version of ollama seems to be out of date.

ggml_sycl_init: found 1 SYCL devices:
pulling manifest 
Error: pull model manifest: 412: 

The model you are attempting to pull requires a newer version of Ollama.

Please download the latest version at:

	https://ollama.com/download

Teejer avatar May 29 '25 14:05 Teejer

Hi @Teejer, could you please upgrade your Ollama version first with pip install ipex-llm[cpp]==2.3.0b20250529 and try again?
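For reference, the upgrade-and-retry flow looks roughly like this (a sketch only; the working directory and model tag are placeholders):

# upgrade the pip package that ships the IPEX-LLM ollama binaries
pip install --pre --upgrade ipex-llm[cpp]==2.3.0b20250529

# re-run init-ollama in your ollama working directory so the symlinks point at the new binaries
cd ~/ipex-ollama/my-ollama-dir   # placeholder path
init-ollama

# restart the server and retry the pull that previously hit the 412 error
./ollama serve &
./ollama pull qwen3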

rnwang04 avatar May 30 '25 03:05 rnwang04

The 2.3.0 binaries from the repo do not see my A770s on startup (no GPU available); the 2.2.0 binaries do.

Still working out ipex-llm and Ollama in Python.

(ipex-llm-test1) thehoff@overmind:~/ipex-ollama/ipex-llm-test1$ pip install ipex-llm[cpp]==2.3.0b20250529
Collecting ipex-llm==2.3.0b20250529 (from ipex-llm[cpp]==2.3.0b20250529)
  Using cached ipex_llm-2.3.0b20250529-py3-none-manylinux2010_x86_64.whl.metadata (8.9 kB)
INFO: pip is looking at multiple versions of ipex-llm[cpp] to determine which version is compatible with other requirements. This could take a while.
ERROR: Could not find a version that satisfies the requirement bigdl-core-cpp==2.7.0b20250529; extra == "cpp" (from ipex-llm[cpp]) (from versions: 2.1.0b2, 2.5.0, 2.6.0b1, 2.6.0b2, 2.6.0b20230911, 2.6.0b20250201, 2.6.0b20250202, 2.6.0b20250203, 2.6.0b20250204, 2.6.0b20250207, 2.6.0b20250209, 2.6.0b20250210, 2.6.0b20250211, 2.6.0b20250212, 2.6.0b20250213, 2.6.0b20250214, 2.6.0b20250215, 2.6.0b20250216, 2.6.0b20250217, 2.6.0b20250218, 2.6.0b20250219, 2.6.0b20250220, 2.6.0b20250221, 2.6.0b20250222, 2.6.0b20250223, 2.6.0b20250224, 2.6.0b20250225, 2.6.0b20250225.post0, 2.6.0b20250226, 2.6.0b20250227, 2.6.0b20250228, 2.6.0b20250230, 2.6.0b20250231, 2.6.0b20250301, 2.6.0b20250302, 2.6.0b20250303, 2.6.0b20250304, 2.6.0b20250305, 2.6.0b20250306, 2.6.0b20250307, 2.6.0b20250308, 2.6.0b20250309, 2.6.0b20250310, 2.6.0b20250311, 2.6.0b20250311.post0, 2.6.0b20250312, 2.6.0b20250312.post0, 2.6.0b20250313, 2.6.0b20250313.post0, 2.6.0b20250314, 2.6.0b20250315, 2.6.0b20250316, 2.6.0b20250317, 2.6.0b20250318, 2.6.0b20250318.post0, 2.6.0b20250319, 2.6.0b20250320, 2.6.0b20250324, 2.6.0b20250325, 2.6.0b20250326, 2.6.0b20250327, 2.6.0b20250328, 2.6.0b20250329, 2.6.0b20250329.post1, 2.6.0b20250329.post2, 2.6.0b20250329.post3, 2.6.0b20250330, 2.6.0b20250331, 2.6.0b20250401, 2.6.0b20250402, 2.6.0b20250403, 2.6.0b20250404, 2.6.0b20250405, 2.6.0b20250406, 2.6.0, 2.7.0b20250407, 2.7.0b20250408, 2.7.0b20250409, 2.7.0b20250410, 2.7.0b20250411, 2.7.0b20250412, 2.7.0b20250413, 2.7.0b20250414, 2.7.0b20250415, 2.7.0b20250416, 2.7.0b20250416.post0, 2.7.0b20250417, 2.7.0b20250418, 2.7.0b20250419, 2.7.0b20250420, 2.7.0b20250421, 2.7.0b20250422, 2.7.0b20250423, 2.7.0b20250425, 2.7.0b20250426, 2.7.0b20250427, 2.7.0b20250428, 2.7.0b20250429, 2.7.0b20250429.post2, 2.7.0b20250430, 2.7.0b20250430.post0, 2.7.0b20250430.post1, 2.7.0b20250501, 2.7.0b20250502, 2.7.0b20250503, 2.7.0b20250504, 2.7.0b20250505, 2.7.0b20250506, 2.7.0b20250507, 2.7.0b20250512, 2.7.0b20250513, 2.7.0b20250514, 2.7.0b20250515, 2.7.0b20250517, 2.7.0b20250518, 2.7.0b20250519, 2.7.0b20250520, 2.7.0b20250521, 2.7.0b20250522, 2.7.0b20250523, 2.7.0b20250524, 2.7.0b20250526, 2.7.0b20250527, 2.7.0b20250528, 2.7.0b20250530, 2.7.0b20250531)
ERROR: No matching distribution found for bigdl-core-cpp==2.7.0b20250529; extra == "cpp"

(ipex-llm-test1) thehoff@overmind:~/ipex-ollama/ipex-llm-test1$ pip install ipex-llm[cpp]==2.3.0rc1
Collecting ipex-llm==2.3.0rc1 (from ipex-llm[cpp]==2.3.0rc1)
  Using cached ipex_llm-2.3.0rc1-py3-none-manylinux2010_x86_64.whl.metadata (8.8 kB)
INFO: pip is looking at multiple versions of ipex-llm[cpp] to determine which version is compatible with other requirements. This could take a while.
ERROR: Could not find a version that satisfies the requirement bigdl-core-cpp==2.7.0rc1; extra == "cpp" (from ipex-llm[cpp]) (from versions: 2.1.0b2, 2.5.0, 2.6.0b1, 2.6.0b2, 2.6.0b20230911, 2.6.0b20250201, 2.6.0b20250202, 2.6.0b20250203, 2.6.0b20250204, 2.6.0b20250207, 2.6.0b20250209, 2.6.0b20250210, 2.6.0b20250211, 2.6.0b20250212, 2.6.0b20250213, 2.6.0b20250214, 2.6.0b20250215, 2.6.0b20250216, 2.6.0b20250217, 2.6.0b20250218, 2.6.0b20250219, 2.6.0b20250220, 2.6.0b20250221, 2.6.0b20250222, 2.6.0b20250223, 2.6.0b20250224, 2.6.0b20250225, 2.6.0b20250225.post0, 2.6.0b20250226, 2.6.0b20250227, 2.6.0b20250228, 2.6.0b20250230, 2.6.0b20250231, 2.6.0b20250301, 2.6.0b20250302, 2.6.0b20250303, 2.6.0b20250304, 2.6.0b20250305, 2.6.0b20250306, 2.6.0b20250307, 2.6.0b20250308, 2.6.0b20250309, 2.6.0b20250310, 2.6.0b20250311, 2.6.0b20250311.post0, 2.6.0b20250312, 2.6.0b20250312.post0, 2.6.0b20250313, 2.6.0b20250313.post0, 2.6.0b20250314, 2.6.0b20250315, 2.6.0b20250316, 2.6.0b20250317, 2.6.0b20250318, 2.6.0b20250318.post0, 2.6.0b20250319, 2.6.0b20250320, 2.6.0b20250324, 2.6.0b20250325, 2.6.0b20250326, 2.6.0b20250327, 2.6.0b20250328, 2.6.0b20250329, 2.6.0b20250329.post1, 2.6.0b20250329.post2, 2.6.0b20250329.post3, 2.6.0b20250330, 2.6.0b20250331, 2.6.0b20250401, 2.6.0b20250402, 2.6.0b20250403, 2.6.0b20250404, 2.6.0b20250405, 2.6.0b20250406, 2.6.0, 2.7.0b20250407, 2.7.0b20250408, 2.7.0b20250409, 2.7.0b20250410, 2.7.0b20250411, 2.7.0b20250412, 2.7.0b20250413, 2.7.0b20250414, 2.7.0b20250415, 2.7.0b20250416, 2.7.0b20250416.post0, 2.7.0b20250417, 2.7.0b20250418, 2.7.0b20250419, 2.7.0b20250420, 2.7.0b20250421, 2.7.0b20250422, 2.7.0b20250423, 2.7.0b20250425, 2.7.0b20250426, 2.7.0b20250427, 2.7.0b20250428, 2.7.0b20250429, 2.7.0b20250429.post2, 2.7.0b20250430, 2.7.0b20250430.post0, 2.7.0b20250430.post1, 2.7.0b20250501, 2.7.0b20250502, 2.7.0b20250503, 2.7.0b20250504, 2.7.0b20250505, 2.7.0b20250506, 2.7.0b20250507, 2.7.0b20250512, 2.7.0b20250513, 2.7.0b20250514, 2.7.0b20250515, 2.7.0b20250517, 2.7.0b20250518, 2.7.0b20250519, 2.7.0b20250520, 2.7.0b20250521, 2.7.0b20250522, 2.7.0b20250523, 2.7.0b20250524, 2.7.0b20250526, 2.7.0b20250527, 2.7.0b20250528, 2.7.0b20250530, 2.7.0b20250531)
ERROR: No matching distribution found for bigdl-core-cpp==2.7.0rc1; extra == "cpp"

thehoff avatar Jun 01 '25 07:06 thehoff

ollama-ipex-llm-2.3.0b20250415-ubuntu (zip/portable from GitHub): no GPU available.

time=2025-06-02T02:31:58.125Z level=INFO source=routes.go:1297 msg="Listening on 127.0.0.1:11434 (version 0.0.0)"
time=2025-06-02T02:31:58.125Z level=INFO source=gpu.go:217 msg="looking for compatible GPUs"
time=2025-06-02T02:31:58.131Z level=INFO source=gpu.go:377 msg="no compatible GPUs were discovered"
time=2025-06-02T02:31:58.131Z level=INFO source=types.go:130 msg="inference compute" id=0 library=cpu variant="" compute="" driver=0.0 name="" total="125.6 GiB" available="123.2 GiB"

dpkg

ii intel-oneapi-mkl-core-2025.1 2025.1.0-801 amd64 Intel® oneAPI Math Kernel Library 2025.1.0 for Linux* core package for Intel(R) 64

gpu list

(ipex-llm-test1) thehoff@overmind:~/ipex-ollama/ollama-ipex-llm-2.2.0-ubuntu$ intel_gpu_top -L
card3    Intel Dg2 (Gen12)    pci:vendor=8086,device=56A0,card=0
└─renderD129
card2    Intel Dg2 (Gen12)    pci:vendor=8086,device=56A0,card=1
└─renderD128
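As a quick sanity check that the SYCL runtime itself can see both A770s before starting ollama, something along these lines should work (assuming a standard oneAPI installation under /opt/intel/oneapi; the device-selector value is just an example):

# load the oneAPI environment so the Level Zero / OpenCL backends are on the library path
source /opt/intel/oneapi/setvars.sh

# list the devices visible to SYCL; both Arc A770s should appear as level_zero GPUs
sycl-ls

# optionally pin ollama to specific GPUs via the oneAPI device selector
export ONEAPI_DEVICE_SELECTOR=level_zero:0,1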

thehoff avatar Jun 02 '25 02:06 thehoff

Hi @thehoff, 2.3.0rc1 is an older version; please try pip install ipex-llm[cpp]==2.3.0b20250530. The "no compatible GPUs" message at startup is somewhat misleading, and we will consider improving it. In fact, if you load a model, you should see that it is loaded on the GPU.
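Put differently, the startup message is not the signal to rely on; whether a loaded model lands on the GPU is. A rough way to check (the model tag is just an example):

# load a model and watch the server log for a line like "ggml_sycl_init: found N SYCL devices"
./ollama run qwen3 "hello"

# in another terminal, intel_gpu_top should show engine activity on the card while it generates
intel_gpu_top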

rnwang04 avatar Jun 03 '25 01:06 rnwang04

The 2.3.0 binaries from the repo do not see my A770s on startup (no GPU available); the 2.2.0 binaries do.

Still working out ipex-llm and Ollama in Python.


What are you doing, bro? Just pip install ipex-llm[cpp] --pre.
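The underlying problem with the pinned installs above is that the matching bigdl-core-cpp build was never published for those pins (the version list pip prints has no 2.7.0b20250529 and no 2.7.0rc1), so the resolver gives up. Letting pip pick the newest pre-release, or checking which nightlies exist before pinning, avoids that (pip index versions is an experimental pip subcommand, shown here only as a sketch):

# let pip resolve the newest published nightly instead of pinning a date
pip install --pre --upgrade ipex-llm[cpp]

# or list the nightly builds that actually exist before pinning one
pip index versions ipex-llm --pre
pip index versions bigdl-core-cpp --pre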

Ellie-Williams-007 avatar Jun 04 '25 14:06 Ellie-Williams-007

I am having a similar issue with this, but with xpu.

When I run pip install --upgrade ipex-llm[xpu]==2.3.0b20250605 --pre --extra-index-url https://download.pytorch.org/whl/xpu I get this: [screenshot of the pip error]. The logs say something about the Python version, but I tested with 3.10 and I get the same thing (without the Python version part, of course). I also tried adding --extra-index-url https://download.pytorch.org/whl/nightly/xpu, but that doesn't work either.

If I try to use the latest portable Ollama version I get the same error as thehoff above, and on top of that, my RAM usage increases every time I chat.

I am using it on a B580

WizardlyBump17 avatar Jun 05 '25 16:06 WizardlyBump17


hi, you can use pip install --pre --upgrade 'ipex-llm[xpu_2.6]>=2.3.0b0,<2.3.0rc1' --extra-index-url https://download.pytorch.org/whl/xpu to install the xpu nightly build.
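Once that install succeeds, a quick way to confirm the XPU build can actually see the B580 (assuming the xpu_2.6 extra pulls in a PyTorch build with XPU support) is something like:

# should print True and a device count of 1 if the Arc B580 is visible to PyTorch
python -c "import torch; print(torch.__version__, torch.xpu.is_available(), torch.xpu.device_count())"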

cyita avatar Jun 06 '25 02:06 cyita


That works, thanks, but it didn't add init-llama-cpp and init-ollama.

WizardlyBump17 avatar Jun 08 '25 02:06 WizardlyBump17


If you want to use ollama/llama.cpp, please install the [cpp] extra with pip install --pre --upgrade ipex-llm[cpp].
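In other words, the two extras serve different purposes: the xpu extra covers the PyTorch/Python path, while [cpp] is what ships init-ollama and init-llama-cpp. A rough sketch of the ollama path (the directory name is just an example):

# install the cpp extra, which provides the init-ollama / init-llama-cpp scripts
pip install --pre --upgrade ipex-llm[cpp]

# create the ollama symlinks in an empty working directory
mkdir -p ~/ollama-ipex && cd ~/ollama-ipex
init-ollama

# then serve and pull models as usual
./ollama serve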

cyita avatar Jun 11 '25 02:06 cyita