bug: Intel iGPU not displayed in Hardware/System Monitor
Jan version
0.6.1
Describe the Bug
I don't know if it's expected to be there, but on one machine (AMD) the iGPU is displayed, albeit with an error when attempting to view the Hardware/System Monitor. On another machine (Intel), I don't see the iGPU listed at all.
Steps to Reproduce
No response
Screenshots / Logs
No response
What is your OS?
- [ ] MacOS
- [x] Windows
- [ ] Linux
Hi @qnixsynapse @louis-menlo, is this related to the library we use to detect hardware?
Intel integrated GPUs are filtered out intentionally. Only AMD iGPUs run reasonably well in Vulkan mode in this case, which might not be covered. @gau-nernst, can you share some thoughts on this? Personally, I would not expect integrated GPUs to be shown here.
IIRC, cortex currently filters out non-AMD GPUs for the Vulkan backend intentionally. When we move to the llama.cpp extension, I think we can support Intel iGPUs via Vulkan.
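For context, here is a minimal, purely illustrative sketch (not the actual cortex/Jan detection code) of what such a vendor allowlist looks like when enumerating Vulkan physical devices; the vendor IDs are the standard PCI IDs (AMD 0x1002, Intel 0x8086):

```cpp
// Illustrative sketch only; not the cortex/Jan hardware-detection code.
// Build (Linux example): g++ -std=c++17 sketch.cpp -lvulkan
#include <cstdint>
#include <cstdio>
#include <vector>
#include <vulkan/vulkan.h>

int main() {
    VkApplicationInfo app{};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.pApplicationName = "gpu-filter-sketch";
    app.apiVersion = VK_API_VERSION_1_1;

    VkInstanceCreateInfo info{};
    info.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    info.pApplicationInfo = &app;

    VkInstance instance = VK_NULL_HANDLE;
    if (vkCreateInstance(&info, nullptr, &instance) != VK_SUCCESS) {
        std::fprintf(stderr, "no Vulkan instance available\n");
        return 1;
    }

    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, nullptr);
    std::vector<VkPhysicalDevice> devices(count);
    vkEnumeratePhysicalDevices(instance, &count, devices.data());

    constexpr uint32_t kVendorAMD   = 0x1002;  // standard PCI vendor IDs
    constexpr uint32_t kVendorIntel = 0x8086;

    for (VkPhysicalDevice dev : devices) {
        VkPhysicalDeviceProperties props{};
        vkGetPhysicalDeviceProperties(dev, &props);

        const bool integrated =
            props.deviceType == VK_PHYSICAL_DEVICE_TYPE_INTEGRATED_GPU;
        const char *vendor =
            props.vendorID == kVendorAMD   ? "AMD"   :
            props.vendorID == kVendorIntel ? "Intel" : "other";
        // Vendor allowlist as described above: only AMD devices are kept,
        // so an Intel iGPU is dropped and never reaches the UI.
        const bool listed = props.vendorID == kVendorAMD;

        std::printf("%-40s %-5s %s -> %s\n",
                    props.deviceName, vendor,
                    integrated ? "(iGPU)" : "(dGPU)",
                    listed ? "listed" : "filtered out");
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```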
On my side (as a user), graceful failure would be nice, or a little note saying which iGPUs/GPUs are expected to be listed there in Hardware.
llama.cpp supports Intel via the SYCL backend; maybe you should add support for this backend? As I understand it, Intel support via Vulkan will not be added to llama.cpp in the near future, and if you try to use Intel via Vulkan, the performance will be much lower. https://github.com/ggml-org/llama.cpp/issues/12690
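For reference, a minimal sketch (assuming a SYCL 2020 implementation such as Intel's oneAPI DPC++, compiled with `icpx -fsycl`; this is not llama.cpp code) showing that an Intel iGPU is visible to the SYCL runtime when the drivers are installed:

```cpp
// Illustrative sketch only; assumes a SYCL 2020 compiler (e.g. icpx -fsycl).
#include <cstdio>
#include <sycl/sycl.hpp>

int main() {
    // List every GPU the SYCL runtime can see; on machines with an Intel iGPU
    // and working Level Zero / OpenCL drivers, the iGPU shows up here.
    for (const sycl::device &dev :
         sycl::device::get_devices(sycl::info::device_type::gpu)) {
        std::printf("%s | vendor: %s\n",
                    dev.get_info<sycl::info::device::name>().c_str(),
                    dev.get_info<sycl::info::device::vendor>().c_str());
    }
    return 0;
}
```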
Hi @nikiluk, can I check whether this is solved for you on 0.6.6? We have moved fully to llama.cpp.
I will close this now, as it has been a while with no new information, so I assume it is fixed.
Will reopen if you have new information.