OllamaError: Unable to connect. Is the computer able to access the url?
[Not a VPN/proxy issue; tried on multiple devices.]
Description
After linking an Ollama API key, I am unable to send a message to any of the models that use Ollama as the provider.
OpenCode version
1.0.143
Steps to reproduce
- Connect Ollama API key.
- Attempt to message any model that uses Ollama as the provider.
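Since the error message itself asks whether the machine can reach the URL, one way to narrow this down is to hit the endpoint directly from the same terminal. A minimal sketch, assuming the default local Ollama port; the hosted URL and the `OLLAMA_API_KEY` variable name are placeholders for illustration, adjust them to whatever opencode is actually pointed at:

```powershell
# Local Ollama daemon (default port 11434): listing models is a cheap reachability check
curl.exe http://localhost:11434/api/tags

# Hosted / Ollama Cloud endpoint (URL assumed); OLLAMA_API_KEY is a placeholder env var
curl.exe https://ollama.com/api/tags -H "Authorization: Bearer $env:OLLAMA_API_KEY"
```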
Screenshot and/or share link
Operating System
Windows 11 (26220.7344)
Terminal
Powershell / Windows Terminal
This issue might be a duplicate of existing issues. Please check:
- #5187: Ollama: User message content arrives as empty array - model cannot see user input
- #4862: need help setting up ollama
- #2667: Remote Ollama stopped working
- #227: bug? Ollama does not work as a provider unless configured with an API key
Feel free to ignore if none of these address your specific case.
Can you show me the output of:
opencode debug config
```
PS C:\Windows\System32> opencode debug config
{
  "agent": {},
  "mode": {},
  "plugin": [],
  "command": {},
  "username": "Carson",
  "keybinds": {
    "leader": "ctrl+x",
    "app_exit": "ctrl+c,ctrl+d,<leader>q",
    "editor_open": "<leader>e",
    "theme_list": "<leader>t",
    "sidebar_toggle": "<leader>b",
    "scrollbar_toggle": "none",
    "username_toggle": "none",
    "status_view": "<leader>s",
    "session_export": "<leader>x",
    "session_new": "<leader>n",
    "session_list": "<leader>l",
    "session_timeline": "<leader>g",
    "session_share": "none",
    "session_unshare": "none",
    "session_interrupt": "escape",
    "session_compact": "<leader>c",
    "messages_page_up": "pageup",
    "messages_page_down": "pagedown",
    "messages_half_page_up": "ctrl+alt+u",
    "messages_half_page_down": "ctrl+alt+d",
    "messages_first": "ctrl+g,home",
    "messages_last": "ctrl+alt+g,end",
    "messages_last_user": "none",
    "messages_copy": "<leader>y",
    "messages_undo": "<leader>u",
    "messages_redo": "<leader>r",
    "messages_toggle_conceal": "<leader>h",
    "tool_details": "none",
    "model_list": "<leader>m",
    "model_cycle_recent": "f2",
    "model_cycle_recent_reverse": "shift+f2",
    "command_list": "ctrl+p",
    "agent_list": "<leader>a",
    "agent_cycle": "tab",
    "agent_cycle_reverse": "shift+tab",
    "input_clear": "ctrl+c",
    "input_paste": "ctrl+v",
    "input_submit": "return",
    "input_newline": "shift+return,ctrl+return,alt+return,ctrl+j",
    "input_move_left": "left,ctrl+b",
    "input_move_right": "right,ctrl+f",
    "input_move_up": "up",
    "input_move_down": "down",
    "input_select_left": "shift+left",
    "input_select_right": "shift+right",
    "input_select_up": "shift+up",
    "input_select_down": "shift+down",
    "input_line_home": "ctrl+a",
    "input_line_end": "ctrl+e",
    "input_select_line_home": "ctrl+shift+a",
    "input_select_line_end": "ctrl+shift+e",
    "input_visual_line_home": "alt+a",
    "input_visual_line_end": "alt+e",
    "input_select_visual_line_home": "alt+shift+a",
    "input_select_visual_line_end": "alt+shift+e",
    "input_buffer_home": "home",
    "input_buffer_end": "end",
    "input_select_buffer_home": "shift+home",
    "input_select_buffer_end": "shift+end",
    "input_delete_line": "ctrl+shift+d",
    "input_delete_to_line_end": "ctrl+k",
    "input_delete_to_line_start": "ctrl+u",
    "input_backspace": "backspace,shift+backspace",
    "input_delete": "ctrl+d,delete,shift+delete",
    "input_undo": "ctrl+-,super+z",
    "input_redo": "ctrl+.,super+shift+z",
    "input_word_forward": "alt+f,alt+right,ctrl+right",
    "input_word_backward": "alt+b,alt+left,ctrl+left",
    "input_select_word_forward": "alt+shift+f,alt+shift+right",
    "input_select_word_backward": "alt+shift+b,alt+shift+left",
    "input_delete_word_forward": "alt+d,alt+delete,ctrl+delete",
    "input_delete_word_backward": "ctrl+w,ctrl+backspace,alt+backspace",
    "history_previous": "up",
    "history_next": "down",
    "session_child_cycle": "<leader>right",
    "session_child_cycle_reverse": "<leader>left",
    "terminal_suspend": "ctrl+z"
  }
}
```
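For reference, if I'm reading the docs right, a custom Ollama provider normally goes under a `provider` block in the opencode config, and I don't see one in the dump above. A rough sketch of what such an entry might look like; the provider id, npm package, model name, and baseURL here are illustrative assumptions, not my actual config:

```json
{
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "llama3.2": {}
      }
    }
  }
}
```

(I'm not sure whether `opencode debug config` is even expected to echo the provider section, so its absence above may not mean anything.)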
@agustif I don't believe that https://github.com/agustif/opencode/issues/12 would resolve this, as I am not running the model locally. It should be using Ollama Cloud.
(Apologies if I misunderstood your issue description.)
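If it helps, my assumption is that the only pieces that should differ for Ollama Cloud versus a local daemon are the baseURL and the API key, roughly like this (the hosted URL and the `{env:...}` substitution are guesses on my part, not verified):

```json
{
  "provider": {
    "ollama": {
      "options": {
        "baseURL": "https://ollama.com/v1",
        "apiKey": "{env:OLLAMA_API_KEY}"
      }
    }
  }
}
```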
Yep, sorry about that, closed my PR!