Failed to establish a new connection
Describe the bug Just installed Alpaca from GNOME Software (Flatpak), tried to download a model, and the following error arose:
HTTPConnectionPool(host='0.0.0.0', port=11435): Max retries exceeded with url: /api/pull (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fe9dc33fa40>: Failed to establish a new connection: [Errno 111] Connection refused'))
Expected behavior The model downloads properly
Debugging information
Yikes, apparently it worked after reopening the app; seems like it was a first-time kind of issue, idk. It happened with every model I tried to install, though.
➜ manuel ~ flatpak run com.jeffser.Alpaca
INFO [main.py | main] Alpaca version: 5.0.5
MESA-INTEL: warning: ../src/intel/vulkan/anv_formats.c:782: FINISHME: support YUV colorspace with DRM format modifiers
MESA-INTEL: warning: ../src/intel/vulkan/anv_formats.c:814: FINISHME: support more multi-planar formats with DRM modifiers
INFO [instance_manager.py | start] Starting Alpaca's Ollama instance...
INFO [instance_manager.py | start] Started Alpaca's Ollama instance
Couldn't find '/home/manuel/.ollama/id_ed25519'. Generating new private key.
Your new public key is:
ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAINl947pzSVftB46LRVr/nUb+5Ci91EgRqImCpbfVO/1W
2025/03/07 13:07:18 routes.go:1205: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://0.0.0.0:11435 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/home/manuel/.var/app/com.jeffser.Alpaca/data/.ollama/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://*] OLLAMA_SCHED_SPREAD:false ROCR_VISIBLE_DEVICES: http_proxy: https_proxy: no_proxy:]"
time=2025-03-07T13:07:18.521-06:00 level=INFO source=images.go:432 msg="total blobs: 0"
time=2025-03-07T13:07:18.522-06:00 level=INFO source=images.go:439 msg="total unused blobs removed: 0"
time=2025-03-07T13:07:18.522-06:00 level=INFO source=routes.go:1256 msg="Listening on [::]:11435 (version 0.5.12)"
time=2025-03-07T13:07:18.522-06:00 level=INFO source=gpu.go:217 msg="looking for compatible GPUs"
INFO [instance_manager.py | start] client version is 0.5.12
time=2025-03-07T13:07:18.528-06:00 level=INFO source=gpu.go:377 msg="no compatible GPUs were discovered"
time=2025-03-07T13:07:18.528-06:00 level=INFO source=types.go:130 msg="inference compute" id=0 library=cpu variant="" compute="" driver=0.0 name="" total="15.4 GiB" available="9.4 GiB"
[GIN] 2025/03/07 - 13:07:18 | 200 | 881.66µs | 127.0.0.1 | GET "/api/tags"
time=2025-03-07T13:07:40.114-06:00 level=INFO source=download.go:176 msg="downloading 74701a8c35f6 in 14 100 MB part(s)"
time=2025-03-07T13:10:01.507-06:00 level=INFO source=download.go:176 msg="downloading 966de95ca8a6 in 1 1.4 KB part(s)"
time=2025-03-07T13:10:02.811-06:00 level=INFO source=download.go:176 msg="downloading fcc5a6bec9da in 1 7.7 KB part(s)"
time=2025-03-07T13:10:04.077-06:00 level=INFO source=download.go:176 msg="downloading a70ff7e570d9 in 1 6.0 KB part(s)"
time=2025-03-07T13:10:05.369-06:00 level=INFO source=download.go:176 msg="downloading 4f659a1e86d7 in 1 485 B part(s)"
[GIN] 2025/03/07 - 13:10:10 | 200 | 2m31s | 127.0.0.1 | POST "/api/pull"
[GIN] 2025/03/07 - 13:10:10 | 200 | 30.474339ms | 127.0.0.1 | POST "/api/show"
/usr/lib/python3.12/site-packages/gi/overrides/Gio.py:42: Warning: g_value_get_int: assertion 'G_VALUE_HOLDS_INT (value)' failed
return Gio.Application.run(self, *args, **kwargs)
time=2025-03-07T13:14:42.618-06:00 level=INFO source=download.go:176 msg="downloading 59bb50d8116b in 16 239 MB part(s)"
System specs
- OS: Fedora 41
- Model: ThinkPad T470p
- CPU: Intel i7-7700HQ
- GPU: Intel HD Graphics 630
- GPU: NVIDIA GeForce 940MX
- RAM: 16 GB
Thanks for the error report. To make this as easy as possible, I have a few questions:
- Did the error appear immediately after starting the download?
- If yes, have you somehow changed the Flatpak's permissions using an app like Flatseal beforehand?
- Have you been able to reproduce it since it stopped happening?
- Yes, it appeared immediately after clicking the download button.
- I did consider changing permissions with Flatseal, but I didn't see any relevant permission turned off, so I ended up not modifying anything.
- Nope, haven't been able to reproduce it.
Hmm, okay. It seems like the internal Ollama instance might have crashed, which is why the connection on its port was refused. That can have many causes: the operating system killing the process when RAM runs low, a bug inside Ollama itself, and so on.
I think it's best to leave this issue open for now and see if someone else runs into the same problem. At least it doesn't seem to be a critical bug if you haven't been able to reproduce it anymore.
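For anyone hitting this, a quick way to tell whether the managed Ollama instance is actually up (as opposed to some other download failure) is to probe its port. This is just a minimal sketch; the port 11435 is taken from the log above, so adjust it to whatever your instance settings show:

```python
import socket

def ollama_reachable(host: str = "127.0.0.1", port: int = 11435,
                     timeout: float = 2.0) -> bool:
    """Return True if something accepts TCP connections on the Ollama port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Errno 111 (connection refused) means nothing is listening there,
        # i.e. the managed Ollama instance is not (or no longer) running.
        return False

if __name__ == "__main__":
    print("Ollama reachable:", ollama_reachable())
```

If this returns False while Alpaca is open, the instance has indeed died and the HTTPConnectionPool errors are just a symptom.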
I got this error too, after adding the Ollama plugin. It appeared during a download, but after an app restart and update it went away on the next start.
Yeah I believe this was fixed with the newest update
I have the same error on TuxedoOS (64 GB RAM). I installed Alpaca via Discover Software Center.
Then I installed the Ollama plugin via terminal: flatpak install com.jeffser.Alpaca.Plugins.Ollama.
When I start Alpaca, I get the error mentioned at the top by @menuRivera. I cannot switch between available and installed models, etc.
The /var/log/syslog gives me the following output when starting Alpaca from the apps:
2025-03-22T23:16:40.077263+01:00 username systemd[1703]: Started app-flatpak-com.jeffser.Alpaca-5503.scope.
2025-03-22T23:16:40.800676+01:00 username flatpak[5514]: INFO#011[main.py | main] Alpaca version: 5.3.0
2025-03-22T23:16:40.817998+01:00 username xdg-desktop-portal-kde[1979]: xdp-kde-settings: Namespace "org.gnome.desktop.a11y.interface" is not supported
2025-03-22T23:16:40.819126+01:00 username xdg-desktop-portal-kde[1979]: xdp-kde-settings: Namespace "org.gnome.desktop.interface" is not supported
2025-03-22T23:16:40.819810+01:00 username xdg-desktop-portal-kde[1979]: xdp-kde-settings: Namespace "org.gnome.desktop.interface" is not supported
2025-03-22T23:16:40.963319+01:00 username flatpak[5514]: ERROR#011[instance_manager.py | get_local_models] HTTPConnectionPool(host='0.0.0.0', port=11434): Max retries exceeded with url: /api/tags (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x723d9123a690>: Failed to establish a new connection: [Errno 111] Connection refused'))
The /var/log/syslog gives me the following output when starting Alpaca from the terminal:
2025-03-22T23:17:50.551048+01:00 username systemd[1703]: Started app-flatpak-com.jeffser.Alpaca-5956.scope.
2025-03-22T23:17:51.302513+01:00 username xdg-desktop-portal-kde[1979]: xdp-kde-settings: Namespace "org.gnome.desktop.a11y.interface" is not supported
2025-03-22T23:17:51.303408+01:00 username xdg-desktop-portal-kde[1979]: xdp-kde-settings: Namespace "org.gnome.desktop.interface" is not supported
2025-03-22T23:17:51.304089+01:00 username xdg-desktop-portal-kde[1979]: xdp-kde-settings: Namespace "org.gnome.desktop.interface" is not supported
I deleted the instances, added Ollama (Managed) again, and then increased the port by one. Now I no longer get the error and can download a model from the available models.
@platomat's solution of changing the Ollama (Managed) port resolved the issue for me with the Flatpak on Fedora 42.
Okay, so for everyone who has that problem, the solution is:
- Remove the Ollama (Managed) instance,
- Add it again,
- Increase its port by one,
- ???
- Profit
I wonder why this is only happening to some people. Quite a weird thing to happen, honestly.