Instructions for Ollama Installation with IPEX on Multi-GPU Fail to Work with Arc GPU
Describe the bug
I am following the documentation from:
- https://github.com/intel/ipex-llm/blob/main/docs/mddocs/Quickstart/ollama_portable_zip_quickstart.md
How to reproduce
Steps to reproduce the error:
- Download the Ollama portable zip.
- Attempt to run it as described in the instructions (a rough sketch of the commands follows).
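For concreteness, the steps look roughly like this; the extraction folder and the model name are placeholders of mine, not values taken from the quickstart:

```bat
rem Sketch of the reproduction steps on Windows, assuming the portable zip
rem was extracted to C:\ollama-ipex-llm (path and model name are placeholders).
cd /d C:\ollama-ipex-llm

rem Start the Ollama server as the quickstart describes:
start-ollama.bat

rem Then, in a second terminal opened in the same folder, try to run a model:
ollama run llama3.2
```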
Screenshots
The following instruction is missing what needs to be run to find the supported SYCL devices:
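For context, what I would have expected here is a way to enumerate the SYCL devices the runtime can see. One option outside of Ollama, assuming the oneAPI Base Toolkit is installed at its default location (the portable zip itself does not require it), is the `sycl-ls` tool:

```bat
rem Hypothetical check, assuming the oneAPI Base Toolkit is installed:
rem initialize the oneAPI environment, then list the visible SYCL devices.
call "C:\Program Files (x86)\Intel\oneAPI\setvars.bat"
sycl-ls
```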
Environment information
Relevant GPU information:
Python 3.13.0
-----------------------------------------------------------------
Transformers is not installed.
-----------------------------------------------------------------
PyTorch is not installed.
-----------------------------------------------------------------
'pip' is not recognized as an internal or external command,
operable program or batch file.
ipex-llm is not installed
-----------------------------------------------------------------
IPEX is not installed properly.
-----------------------------------------------------------------
Total Memory: 31.927 GB
Chip 0 Memory: 8 GB | Speed: 2133 MHz
Chip 1 Memory: 8 GB | Speed: 2133 MHz
Chip 2 Memory: 8 GB | Speed: 2133 MHz
Chip 3 Memory: 8 GB | Speed: 2133 MHz
-----------------------------------------------------------------
CPU Manufacturer: AuthenticAMD
CPU MaxClockSpeed: 3801
CPU Name: AMD Ryzen 9 3900X 12-Core Processor
CPU NumberOfCores: 12
CPU NumberOfLogicalProcessors: 24
-----------------------------------------------------------------
GPU 0: Intel(R) Arc(TM) A770 Graphics Driver Version: 32.0.101.6737
-----------------------------------------------------------------
-----------------------------------------------------------------
...
OS Name: Microsoft Windows 11 Pro
...
'xpu-smi' is not recognized as an internal or external command,
operable program or batch file.
xpu-smi is not installed properly.
Additional context
Logs from running .\start-ollama.bat (there is no option to select the Intel Arc GPU; a possible device-selection workaround is sketched after the log):
time=2025-04-30T17:27:50.331+02:00 level=INFO source=gpu.go:217 msg="looking for compatible GPUs"
time=2025-04-30T17:27:50.331+02:00 level=INFO source=gpu_windows.go:167 msg=packages count=1
time=2025-04-30T17:27:50.331+02:00 level=INFO source=gpu_windows.go:214 msg="" package=0 cores=12 efficiency=0 threads=24
time=2025-04-30T17:27:50.497+02:00 level=INFO source=gpu.go:319 msg="detected OS VRAM overhead" id=GPU-8876a550-ac1e-a380-98d9-a8ce5b9eadc8 library=cuda compute=7.5 driver=12.9 name="NVIDIA GeForce GTX 1660 SUPER" overhead="831.5 MiB"
time=2025-04-30T17:27:50.500+02:00 level=INFO source=types.go:130 msg="inference compute" id=GPU-8876a550-ac1e-a380-98d9-a8ce5b9eadc8 library=cuda variant=v12 compute=7.5 driver=12.9 name="NVIDIA GeForce GTX 1660 SUPER" total="6.0 GiB" available="5.0 GiB"
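One workaround that may be worth trying (an assumption on my part, based on the standard oneAPI device selector variable rather than anything stated in the quickstart) is pinning the SYCL runtime to the Arc GPU before starting the server:

```bat
rem Assumed workaround, not confirmed by the quickstart: restrict the SYCL
rem runtime to the first Level Zero (Intel) device, then start the server.
set ONEAPI_DEVICE_SELECTOR=level_zero:0
start-ollama.bat
```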
The "Find X SYCL devices" output only appears while a model is running; you can try running a model first.
Thank you very much for responding, @qiuxin2012.
When I run it, it shows that the selected GPU is the CUDA 1660 SUPER, so I suspect running a model would work. I'll try that to make sure.
My guess, though, was that it wouldn't be using the Arc GPU.
Hi @Kaszanas, you will not see any Intel GPU info until you load a model.