Invalid URL when calling OpenRouter model (`prime-intellect/intellect-3`, `deepseek/deepseek-r1-0528`)
Description
I'm getting a `TypeError [ERR_INVALID_URL]: "undefined/chat/completions" cannot be parsed as a URL` when trying to run inference against an OpenRouter-hosted model (`prime-intellect/intellect-3`, `deepseek/deepseek-r1-0528`). The failure occurs even though an OpenRouter API key is correctly configured and funded, and `/connect` reports the provider as active. I am able to run inference against the default OpenRouter models.
The error suggests the client is not resolving the OpenRouter base URL, so the string `undefined/chat/completions` ends up being passed to the `fetch` call.
Config
Follows the opencode.ai docs
"provider": {
"openrouter": {
"models": {
"prime-intellect/intellect-3": {},
"deepseek/deepseek-r1-0528": {}
}
}
}
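For context, that fragment sits at the top level of the config file. A minimal complete `opencode.json` would look like the following (assuming the `$schema` URL from the opencode.ai docs; treat this as a sketch of the setup, not a verified config):

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "openrouter": {
      "models": {
        "prime-intellect/intellect-3": {},
        "deepseek/deepseek-r1-0528": {}
      }
    }
  }
}
```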
OpenCode version
1.0.134
Steps to reproduce
- Add the provider config above to `opencode.json`
- Connect the openrouter provider with an API key
- Try to run inference, for example:
  `opencode run -m openrouter/prime-intellect/intellect-3 hello`
Screenshot and/or share link
No response
Operating System
macOS Sequoia 15.5
Terminal
kitty
This issue might be a duplicate of existing issues. Please check:
- #195: Attempting to use LM Studio with OpenCode. Same error message (`TypeError [ERR_INVALID_URL]: "undefined/chat/completions" cannot be parsed as URL`) when trying to use a custom provider with a `baseURL` configuration.
The root cause appears to be related to how custom provider `baseURL` configurations are resolved. Issue #195 also involves an OpenAI-compatible provider with a `baseURL`, which is similar to your OpenRouter setup.
Feel free to ignore if your specific case differs.
Fixed in the next release: https://github.com/sst/opencode/commit/c30b1130eea1d2317105b4b3e3f69cfeb5bdc6de
Amazing, thanks!