Unable to connect to ollama. Please, check that ollama is running.
When I go to http://127.0.0.1:11434/
I see: Ollama is running
Perhaps a recent Ollama update broke the extension?
❯ ollama --version
ollama version is 0.3.11
Try closing/quitting Ollama and opening it again. I have run into that problem on a few occasions.
Hey @anjerodev,
I’m experiencing the same issue. It seems that even though Ollama is running, the connection breaks when attempting to interact with it. I tried restarting Ollama, but the issue persists.
Here's the debug log I managed to gather while reproducing the issue:
OLLAMA_DEBUG=1 ollama serve
2024/10/02 13:25:48 routes.go:1153: INFO server config env="map[HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_DEBUG:true OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_KEEP_ALIVE:5m0s OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/Users/loerus/.ollama/models OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://*] OLLAMA_SCHED_SPREAD:false OLLAMA_TMPDIR: http proxy: https proxy: no proxy:]"
time=2024-10-02T13:25:48.512+02:00 level=INFO source=images.go:753 msg="total blobs: 32"
time=2024-10-02T13:25:48.514+02:00 level=INFO source=images.go:760 msg="total unused blobs removed: 0"
time=2024-10-02T13:25:48.515+02:00 level=INFO source=routes.go:1200 msg="Listening on 127.0.0.1:11434 (version 0.3.12)"
time=2024-10-02T13:25:48.515+02:00 level=INFO source=common.go:135 msg="extracting embedded files" dir=/var/folders/qz/h10bkb996sn_h4m9x4p0n44r0000gn/T/ollama1851176616/runners
[GIN] 2024/10/02 - 13:28:13 | 400 | 303.042µs | 127.0.0.1 | POST "/api/chat"
I observed a 400 response to the request when trying to interact with Ollama through the extension. My Ollama version is 0.3.12; hopefully, that helps track down the issue.
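In case it helps anyone narrow this down, a request like the one below can be sent by hand to the same endpoint; unlike the GIN log line, which only shows the status code, the body of the 400 response usually says what Ollama objected to. The model name here is just a placeholder for whatever the extension is configured to use:

# Replay a minimal chat request against the endpoint the extension hits.
# "llama3" is only a placeholder; substitute your configured model.
curl -i http://127.0.0.1:11434/api/chat \
  -d '{"model": "llama3", "messages": [{"role": "user", "content": "Say hello"}], "stream": false}'

If that comes back with a 200 while the extension still gets a 400, the problem is presumably in the payload the extension builds rather than in Ollama itself.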
Thanks for creating this awesome extension!
Yeah, I also tried restarting and even rebooted my machine, but the problem persists. I know Ollama was updated on my machine recently, so I assume it has something to do with that?
@TheDeveloper @lentrup Hi! Could you share more details, such as your machine’s OS (Windows, macOS) and your Commitollama configuration? I’m also on Ollama v0.3.12, and I’m not encountering that error.
@anjerodev
Hi, thanks for your help! Here's my setup:
Machine Details:
Model: MacBook Pro
Chip: Apple M1 Pro
macOS Version: 14.6.1
When I ran OLLAMA_DEBUG=1 ollama serve, I noticed the following:
[GIN] 2024/10/03 - 11:31:22 | 200 | 4.422231542s | 127.0.0.1 | POST "/api/chat"
[GIN] 2024/10/03 - 11:32:26 | 400 | 1.870708ms | 127.0.0.1 | POST "/api/chat"
The first request came from Open WebUI and received a 200 response. However, when I try to use Commitollama via VSCode, I get a 400 error every time.
I also tried downgrading Commitollama through a few previous versions, but the problem persists. It seems this issue isn’t tied to a specific version of Commitollama.
You can view my current Commitollama configuration here:
Thanks again for looking into this!
@TheDeveloper @lentrup Hi! Could you share more details, such as your machine’s OS (Windows, macOS) and your Commitollama configuration? I’m also on Ollama v0.3.12, and I’m not encountering that error.
Sure, I'm running Windows 11. The settings were working fine; I didn't make any changes, but Ollama did update, and it stopped working after that.
Here are my settings:
"commitollama.useDescription": true,
"commitollama.custom.model": "llama3",
"commitollama.model": "llama3"
Is this still a problem? I have made several attempts but am unable to reproduce the issue.
No, it seems to be working now. I'm not sure if it was an update that fixed it.
OK, thank you for letting me know. It did drive me a little crazy when I couldn’t figure out what the actual problem was. I’ll close the issue then.