Nemotron seems to ignore my request when using ollama
Description
I am trying to run opencode with a local LLM (nemotron-3-nano:30b) served via ollama. When I send a message in plan mode, I get a response stating that the model is in plan mode but has not received a request. I get a similar response in build mode (though sometimes the response includes tool calls that error out).
OpenCode version
1.0.206
Steps to reproduce
- Install ollama
- Install opencode via npm
- Create a new folder and create an opencode.json file
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "http://127.0.0.1:11434/v1"
      },
      "models": {
        "nemotron-3-nano:30b": {}
      }
    }
  }
}
- Open a terminal in that folder, start opencode, select the model, and try to send a message
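Note: opencode.json must be strict JSON, and a trailing comma after the "baseURL" line (easy to leave in) makes it invalid for strict parsers, which is worth ruling out before blaming the model. A quick sanity check of the config above, sketched in plain Python (nothing opencode-specific):

```python
import json

# The opencode.json from the steps above, without a trailing comma
# after "baseURL" -- strict JSON parsers reject trailing commas.
config_text = """
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "http://127.0.0.1:11434/v1"
      },
      "models": {
        "nemotron-3-nano:30b": {}
      }
    }
  }
}
"""

# Raises json.JSONDecodeError if the file is malformed.
config = json.loads(config_text)
print(config["provider"]["ollama"]["options"]["baseURL"])
# prints http://127.0.0.1:11434/v1
```

In my case the config parses once the trailing comma is removed, so the issue does not appear to be a malformed config file.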
Screenshot and/or share link
https://opncd.ai/share/b5WLRobS
Operating System
Windows 11
Terminal
Windows Terminal