Ollama models not appearing in my model selection menu. Ollama setup trouble.
Question
This time around, when setting up opencode with Ollama for offline use, I am having a lot of trouble. This is what my configuration file looks like:
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama (PC1)",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "ministral-3:3b": {
          "name": "ministral-3:3b"
        }
      }
    }
  }
}
Whenever I use the /models command, it does not give me an option for my Ollama models.
I am currently on Debian Linux 13, with opencode version 1.0.134 and Ollama version 0.13.1.
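Before digging into the config, it may be worth confirming that Ollama's OpenAI-compatible endpoint is actually reachable at the `baseURL` above. A minimal sanity check, assuming Ollama is running on its default port 11434 and the `ollama` CLI is installed:

```shell
# List the model tags Ollama has installed locally; the model ids in
# opencode.json must match these tags exactly.
command -v ollama >/dev/null && ollama list

# Hit the OpenAI-compatible endpoint that opencode's baseURL points at.
curl -s http://localhost:11434/v1/models \
  || echo "Ollama is not reachable on port 11434"
```

If the curl call returns a JSON list of models but opencode still shows nothing, the problem is likely on the opencode side (config file location or syntax) rather than with Ollama itself.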
This issue might be a duplicate of existing issues. Please check:
- #4862: Ollama provider initialization fails with similar configuration setup (ProviderInitError with TypeError)
- #4428: Similar issues with Ollama models not working despite correct configuration
- #227: Ollama requires API key configuration even for localhost setup
Feel free to ignore if none of these address your specific case.
No, I can't get my Ollama models to come up at all; they do not even appear as an option.
I tested whether only the smaller models had the problem, but I'm seeing the same issue with the larger models as well.
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama (local)",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "qwen3-vl:235b": {
          "name": "qwen3-vl:235b"
        }
      }
    }
  }
}
I suspect the file isn't in the correct location.
Can you run:
opencode debug config
{
  "agent": {},
  "mode": {},
  "plugin": [],
  "command": {},
  "username": "misty",
  "keybinds": {
    "leader": "ctrl+x",
    "app_exit": "ctrl+c,ctrl+d,<leader>q",
    "editor_open": "<leader>e",
    "theme_list": "<leader>t",
    "sidebar_toggle": "<leader>b",
    "username_toggle": "none",
    "status_view": "<leader>s",
    "session_export": "<leader>x",
    "session_new": "<leader>n",
    "session_list": "<leader>l",
    "session_timeline": "<leader>g",
    "session_share": "none",
    "session_unshare": "none",
    "session_interrupt": "escape",
    "session_compact": "<leader>c",
    "messages_page_up": "pageup",
    "messages_page_down": "pagedown",
    "messages_half_page_up": "ctrl+alt+u",
    "messages_half_page_down": "ctrl+alt+d",
    "messages_first": "ctrl+g,home",
    "messages_last": "ctrl+alt+g,end",
    "messages_last_user": "none",
    "messages_copy": "<leader>y",
    "messages_undo": "<leader>u",
    "messages_redo": "<leader>r",
    "messages_toggle_conceal": "<leader>h",
    "tool_details": "none",
    "model_list": "<leader>m",
    "model_cycle_recent": "f2",
    "model_cycle_recent_reverse": "shift+f2",
    "command_list": "ctrl+p",
    "agent_list": "<leader>a",
    "agent_cycle": "tab",
    "agent_cycle_reverse": "shift+tab",
    "input_clear": "ctrl+c",
    "input_forward_delete": "ctrl+d",
    "input_paste": "ctrl+v",
    "input_submit": "return",
    "input_newline": "shift+return,ctrl+j",
    "history_previous": "up",
    "history_next": "down",
    "session_child_cycle": "<leader>right",
    "session_child_cycle_reverse": "<leader>left",
    "terminal_suspend": "ctrl+z"
  }
}
I am editing the file with nano from the command line on my Debian system.
nano ~/.local/share/opencode/auth.json
The file contents are the following:
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama (local)",
      "options": {
        "baseURL": "http://localhost:11434/"
      },
      "models": {
        "qwen3-vl:235b": {
          "name": "qwen3-vl:235b"
        }
      }
    }
  }
}
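One thing to double-check here: in typical opencode installs, `~/.local/share/opencode/auth.json` is the credential store, not the configuration file, so provider settings pasted there may simply be ignored. The global config is usually read from `~/.config/opencode/opencode.json` (or a legacy `~/.opencode/opencode.json`). A quick way to see which of those files exists (the paths are assumptions based on common opencode layouts, not confirmed from this machine):

```shell
# Show whichever of the usual opencode config locations exist.
ls -l ~/.config/opencode/opencode.json ~/.opencode/opencode.json 2>/dev/null \
  || echo "no opencode.json found at the usual locations"
```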
- Unfortunately, even with the file provided, I'm still getting this error:
Config file at /home/qwerty/.opencode/opencode.json is not valid JSON(C):
--- JSONC Input ---
{
"$schema": "https://opencode.ai/config.json",
"provider": {
"ollama": {
"npm": "@ai-sdk/openai-compatible",
"name": "Ollama (local)",
"options": {
"baseURL": "http://localhost:11434/v1"
},
"models": {
"llama3": {
"name": "Llama 3"
},
"kimi-k2-thinking:cloud": {
"name": "kimi k2"
},
"qwen3-v1:latest": {
"name": "qwen3"
},
"gpt-oss:latest": {
"name": "gtp-oss"
},
"llama3:latest": {
"name": "llama3"
},
"devstral-small-2:latest": {
"name": "devstral"
}
}
}
}
--- Errors ---
CloseBraceExpected at line 32, column 1
--- End ---
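A brace mismatch like this can be caught from the shell before restarting opencode. A small sketch using Python's stdlib `json.tool` (the `/tmp` path and its contents are just for demonstration; point it at the real `~/.opencode/opencode.json` instead):

```shell
# Write a deliberately broken demo config (missing two closing braces)...
cat > /tmp/opencode-demo.json <<'EOF'
{
  "provider": {
    "ollama": {
      "options": { "baseURL": "http://localhost:11434/v1" }
  }
EOF

# ...then validate it: json.tool pretty-prints valid JSON, or exits
# non-zero and reports the position of the first syntax error.
python3 -m json.tool /tmp/opencode-demo.json || echo "config is not valid JSON"
```

Note that strict JSON validators like this will also flag comments or trailing commas that opencode's JSONC parser accepts, so treat a failure here as a hint rather than proof.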
- I have updated the file and added the missing closing brace.
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama (local)",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "llama3": {
          "name": "Llama 3"
        },
        "kimi-k2-thinking:cloud": {
          "name": "kimi k2"
        },
        "qwen3-v1:latest": {
          "name": "qwen3"
        },
        "gpt-oss:latest": {
          "name": "gtp-oss"
        },
        "llama3:latest": {
          "name": "llama3"
        },
        "devstral-small-2:latest": {
          "name": "devstral"
        }
      }
    }
  }
}
It is working now. Thank you.