
Alpaca cannot find my GPU

Open · hmmistic opened this issue 8 months ago · 6 comments

Describe the bug Alpaca cannot find my GPU (NVIDIA RTX 3050); models run only on the CPU. The logs say "no compatible GPUs were discovered".

Expected behavior Alpaca should find my GPU and be able to run models on it.

Screenshots (screenshot attached)

Debugging information

> flatpak run com.jeffser.Alpaca
INFO    [main.py | main] Alpaca version: 5.3.0
MESA-INTEL: warning: ../src/intel/vulkan/anv_formats.c:782: FINISHME: support YUV colorspace with DRM format modifiers
MESA-INTEL: warning: ../src/intel/vulkan/anv_formats.c:814: FINISHME: support more multi-planar formats with DRM modifiers
INFO    [instance_manager.py | start] Starting Alpaca's Ollama instance...
INFO    [instance_manager.py | start] Started Alpaca's Ollama instance
Couldn't find '/home/rashad/.ollama/id_ed25519'. Generating new private key.
INFO    [instance_manager.py | start] client version is 0.6.2
Your new public key is: 

ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICD3KdhXCcHE+/1RpEb6wlSueKsaHrLGpu0fVksbaken

2025/04/07 14:39:22 routes.go:1230: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:2048 OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://0.0.0.0:11435 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/home/rashad/.var/app/com.jeffser.Alpaca/data/.ollama/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://*] OLLAMA_SCHED_SPREAD:false ROCR_VISIBLE_DEVICES: http_proxy: https_proxy: no_proxy:]"
time=2025-04-07T14:39:22.537+04:00 level=INFO source=images.go:432 msg="total blobs: 27"
time=2025-04-07T14:39:22.537+04:00 level=INFO source=images.go:439 msg="total unused blobs removed: 0"
time=2025-04-07T14:39:22.537+04:00 level=INFO source=routes.go:1297 msg="Listening on [::]:11435 (version 0.6.2)"
time=2025-04-07T14:39:22.537+04:00 level=INFO source=gpu.go:217 msg="looking for compatible GPUs"
time=2025-04-07T14:39:22.557+04:00 level=INFO source=gpu.go:377 msg="no compatible GPUs were discovered"
time=2025-04-07T14:39:22.557+04:00 level=INFO source=types.go:130 msg="inference compute" id=0 library=cpu variant="" compute="" driver=0.0 name="" total="15.3 GiB" available="9.2 GiB"
[GIN] 2025/04/07 - 14:39:22 | 200 |     671.943µs |       127.0.0.1 | GET      "/api/tags"
[GIN] 2025/04/07 - 14:39:22 | 200 |    8.563036ms |       127.0.0.1 | POST     "/api/show"
[GIN] 2025/04/07 - 14:39:22 | 200 |    14.78111ms |       127.0.0.1 | POST     "/api/show"
[GIN] 2025/04/07 - 14:39:22 | 200 |   17.805069ms |       127.0.0.1 | POST     "/api/show"
[GIN] 2025/04/07 - 14:39:22 | 200 |   18.562658ms |       127.0.0.1 | POST     "/api/show"
[GIN] 2025/04/07 - 14:39:22 | 200 |   21.776605ms |       127.0.0.1 | POST     "/api/show"

hmmistic · Apr 07 '25

I faced this on a dual-GPU system and fixed it by switching NVIDIA Prime from On-Demand to Performance mode. I had no luck just starting Alpaca "on the GPU", as that led to screen corruption.
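For reference, on Ubuntu-based distros the Prime mode switch described above can be done with the `prime-select` tool. This is a hedged sketch: other distros ship different switchers (e.g. `envycontrol`, `system76-power`), and a relogin or reboot is needed for the change to apply.

```shell
# Show the current Prime mode (on-demand, nvidia, or intel).
prime-select query

# Switch to Performance mode, i.e. always render on the NVIDIA GPU.
sudo prime-select nvidia

# Log out and back in (or reboot) for the change to take effect.
```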

rkraneis · Apr 11 '25

Just to chime in: I tried to override the default environment variables to force Alpaca to find my GPU:

sudo flatpak override --env=CUDA_VISIBLE_DEVICES=GPU-70fca5f3-XXXX-XXXX-XXXX-XXXXXXXXXXXX com.jeffser.Alpaca && flatpak run com.jeffser.Alpaca

However, I still get the "no compatible GPUs were discovered" error.
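For anyone experimenting with overrides like this, Flatpak can list what is currently applied to the app, and reset it if a failed experiment leaves a stale override behind. A minimal sketch, assuming the app ID `com.jeffser.Alpaca` from the command above:

```shell
# Show all overrides currently applied to the Alpaca flatpak
# (filesystem permissions, env variables, etc.).
flatpak override --show com.jeffser.Alpaca

# Remove every override for this app, returning it to its defaults.
sudo flatpak override --reset com.jeffser.Alpaca
```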

KhaaL · Apr 23 '25

I faced this on another (AMD) system, too. There I just switched to the up-to-date official ollama and the issue went away. I did not investigate further.

rkraneis · May 15 '25

Oh, and I overlooked part of your comment above, @KhaaL. Some of these variables cannot be set on the command line (I fell into the same trap); they have to be set in Alpaca's settings window.
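A quick way to check whether the sandbox can see the GPU at all, independent of what Ollama reports, is to open a shell inside the flatpak and look for the device nodes. This is a diagnostic sketch; the paths assume an NVIDIA card (for AMD, look for `/dev/kfd` and `/dev/dri/renderD*` instead):

```shell
# Spawn a shell inside Alpaca's sandbox and list the GPU device nodes
# visible there. If /dev/nvidia* is missing, Ollama inside the sandbox
# has no chance of discovering the GPU.
flatpak run --command=sh com.jeffser.Alpaca -c 'ls -l /dev/nvidia* /dev/dri 2>&1'
```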

rkraneis · May 15 '25

Confirming the issue on an AMD card. The RX 6800 (gfx1030) is not detected while using the Flatpak extensions (Ollama and AMD).

ArturSultanov · Jun 01 '25

Same issue on an RX 6700 (gfx1030) as well.

I installed Alpaca, the AMD extension, and the Ollama instance all through Flatpak.
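The Ollama server log earlier in this thread shows `HSA_OVERRIDE_GFX_VERSION` as empty. For RDNA2 cards that ROCm does not pick up out of the box, a commonly reported workaround is to pin that variable to a supported gfx version. This is an untested sketch, not a confirmed fix for this setup, and note the caveat above that some variables only take effect when set in Alpaca's settings window:

```shell
# Untested: tell ROCm to treat the card as gfx1030 (version 10.3.0),
# then launch Alpaca to see whether the GPU is now discovered.
flatpak override --user --env=HSA_OVERRIDE_GFX_VERSION=10.3.0 com.jeffser.Alpaca
flatpak run com.jeffser.Alpaca
```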

My override settings are as follows: (screenshot attached)


Edit

The GPU is working and detected after updating to version 7.5 and changing to the following parameters: (screenshot attached)

Thanks for the work!

Sixth2538 · Jul 18 '25