Unable to use OpenRouter models when extensions are enabled
Describe the bug
When I try to use OpenRouter models, I get "Request failed with status: 404 Not Found" errors, but only if I enable extensions. If I disable them, it works. For some reason the JetBrains extension doesn't have this problem.
To Reproduce
- Open Goose desktop
- Configure OpenRouter with an API key
- Select any model from OpenRouter (the problem doesn't occur with Google or Groq models)
- Turn on any extension (although, for some reason, the "JetBrains" extension works fine?)
- Chat
- Get:
Ran into this error: Request failed: Request failed with status: 404 Not Found. Please retry if you think this is a transient or recoverable error.
- Turn off all extensions
- Works fine!
Screenshots
Please provide the following information:
- OS & Arch: macOS (latest), M1
- Interface: UI
- Version: v1.0.5
- Extensions enabled: If I disable them all, it works.
- Provider & Model: Tried a bunch
Same here, but disabling extensions still gives me the same error with OpenRouter or local Ollama.
@joepio can you list the model you used with OpenRouter?
@Elite can you provide the steps to reproduce it? Also, which version did you use?
Thanks
1. Installed Goose v1.0.7.
2. Configured both the OpenRouter and Ollama providers.
3. No matter whether I choose OpenRouter or Ollama, I get the same error.
@Elite can you make sure you have the right model name? By running ollama list?
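For anyone checking this locally: a minimal sketch of listing the installed Ollama model names over Ollama's HTTP API, equivalent to running ollama list. The default host http://localhost:11434 (the same value Goose asks for as OLLAMA_HOST) is an assumption here.

```python
# List locally installed Ollama models via Ollama's HTTP API.
# Equivalent to running `ollama list` on the command line.
import requests

OLLAMA_HOST = "http://localhost:11434"  # assumption: Ollama's default host/port

resp = requests.get(f"{OLLAMA_HOST}/api/tags", timeout=10)
resp.raise_for_status()

for model in resp.json().get("models", []):
    # Use the exact "name" value (e.g. "llama3.2:latest") as the model in Goose.
    print(model["name"])
```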
Irrespective of whether the model selection is right or wrong, what causes the error on the OpenRouter API? On top of that, all Goose asks me for is the OLLAMA_HOST value!
UPDATE:
I added the model listed by ollama list manually and it seems to be reaching the host now. Not sure why OpenRouter gives the error, though.
Update 2: OK, so selecting the model google/gemini-2.0-pro-exp-02-05:free fixes the issue on OpenRouter.
Hmm, I tried google/gemini-2.0-pro-exp-02-05:free on OpenRouter, but it just doesn't reply with anything.
Other models (like mistralai/mistral-nemo:free and cognitivecomputations/dolphin3.0-r1-mistral-24b:free) failed.
The anthropic/claude-3.5-sonnet model at OpenRouter, however, does work.
I'll update OP
My bad, I thought you were using Ollama. Glad to hear selecting the right model fixes it.
@joepio can you restart Goose? Also, can you check the OpenRouter activity page to see the raw metadata?
Thanks
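If the web activity page is inconvenient, the same raw metadata can be pulled programmatically. A sketch, assuming OpenRouter's /api/v1/generation endpoint and an OPENROUTER_API_KEY environment variable; the generation id below is a hypothetical placeholder taken from the id field of a chat completion response.

```python
# Fetch the raw metadata for a single OpenRouter request by its generation id,
# as an alternative to browsing the activity page.
import os
import requests

generation_id = "gen-..."  # placeholder: take the id from a completion response

resp = requests.get(
    "https://openrouter.ai/api/v1/generation",
    params={"id": generation_id},
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # provider, model, token counts, etc.
```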
@joepio I checked the log; these models don't support tool use:
2025-02-19T17:42:58.387534Z ERROR goose::agents::truncate: Error: Request failed: Object {"error": Object {"code": Number(404), "message": String("No endpoints found that support tool use. To learn more about provider routing, visit: https://openrouter.ai/docs/provider-routing")}}
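Since enabling extensions makes Goose request tool use, only tool-capable models will work. A sketch of filtering the OpenRouter catalogue down to such models; it assumes the public /api/v1/models listing exposes a supported_parameters field that includes "tools" for tool-capable models.

```python
# Print OpenRouter model ids that advertise tool support, to avoid the
# "No endpoints found that support tool use" 404 when extensions are enabled.
import requests

resp = requests.get("https://openrouter.ai/api/v1/models", timeout=30)
resp.raise_for_status()

for model in resp.json().get("data", []):
    if "tools" in model.get("supported_parameters", []):
        print(model["id"])
```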
I checked in https://github.com/block/goose/pull/1293 to expose the error message.
Closing this now. If you have any other questions, feel free to reopen it.