AI_APICallError: litellm.BadRequestError: AzureException BadRequestError - { "error": { "message": "Missing required parameter: 'tools[0].name'.", ...
Description
My config:
"provider": {
"myprovider": {
"npm": "@ai-sdk/openai-compatible",
"name": "myprovider",
"options": {
"baseURL": "myproviderapi.example.com/v1",
"apiKey": "{env:myproviderGPT_OPENAI_API_KEY}",
},
"models": {
"gpt-5": {
"name": "gpt-5",
},
"gpt-5-mini": {
"name": "gpt-5-mini",
},
"gpt-5-codex": {
"name": "gpt-5-codex",
},
"gpt-5-nano": {
"name": "gpt-5-nano",
},
"gpt-4.1": {
"name": "gpt-4.1",
},
"gpt-4.1-mini": {
"name": "gpt-4.1-mini",
},
"gpt-4o": {
"name": "gpt-4o",
},
"gpt-4o-mini": {
"name": "gpt-4o-mini",
},
"o4-mini": {
"name": "o4-mini",
},
"o3": {
"name": "o3",
},
"o3-mini": {
"name": "o3-mini",
},
"o1": {
"name": "o1",
},
},
},
},
When trying to use myprovider/gpt-5-codex, I get the following error:
AI_APICallError: litellm.BadRequestError: AzureException BadRequestError - {
"error": {
"message": "Missing required parameter: 'tools[0].name'.",
"type": "invalid_request_error",
"param": "tools[0].name",
"code": "missing_required_parameter"
}
}. Received Model Group=gpt-5-codex
Available Model Group Fallbacks=None
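The error complains that `tools[0].name` is missing, which suggests the request reaching Azure carries tool definitions without a top-level `name` field. A plausible cause (an assumption, not confirmed from the logs): the Chat Completions wire format nests the tool name under `function`, while the Responses API, which the codex models are served through, expects `name` at the top level, and the gateway is not translating between the two shapes. A minimal sketch of the mismatch, with a helper that flags the entries Azure would reject:

```python
# Hypothetical illustration of the two tool shapes an OpenAI-style
# gateway may need to translate between. Neither dict is taken from
# the actual failing request.

# Chat Completions shape: the name lives under "function".
chat_completions_tool = {
    "type": "function",
    "function": {
        "name": "read_file",
        "description": "Read a file from disk",
        "parameters": {"type": "object", "properties": {}},
    },
}

# Responses API shape: the name is a top-level field.
responses_tool = {
    "type": "function",
    "name": "read_file",
    "description": "Read a file from disk",
    "parameters": {"type": "object", "properties": {}},
}

def missing_top_level_names(tools):
    """Return indices of tool entries with no top-level 'name' key,
    i.e. the entries the Azure error above would complain about."""
    return [i for i, t in enumerate(tools) if "name" not in t]

print(missing_top_level_names([chat_completions_tool]))  # [0]
print(missing_top_level_names([responses_tool]))         # []
```

If this guess is right, a Chat Completions-shaped payload forwarded unchanged to a Responses-only model would fail on exactly the first tool entry, matching `param: "tools[0].name"`.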
OpenCode version
v0.15.8
Steps to reproduce
No response
Screenshot and/or share link
No response
Operating System
macOS 15.6.1
Terminal
Ghostty
This issue might be a duplicate of existing issues. Please check:
- #3044: Similar Azure GPT-5 Codex error with tool calling after several minutes of usage
- #2915: LiteLLM tool calling error with Anthropic models requiring tools parameter
- #3245: MCP server tool validation error with similar tools.name pattern matching issues
- #210: Tool usage failures with OpenAI compatible providers despite working fine with direct subscriptions
Feel free to ignore if none of these address your specific case.
Are you using any mcp servers?
Yes
@tnthi115 what happens if u remove them does it work without them?
I commented out my mcp config, but I got the same error.
@tnthi115 and u restarted opencode after updating the config?
@rekram1-node Yes, I restarted it after updating the config. To confirm, I even asked a working model to use the GitLab MCP server and it couldn't, so the MCP config was definitely disabled.
So what's your setup? Is this a custom provider -> litellm -> azure?
Yes, litellm gateway and azure openai instance. I also have the same error as https://github.com/sst/opencode/issues/2387 and the same setup as one of the ones mentioned in that issue.
I can't view that link you sent, is it relevant to debugging? Can u maybe provide a screenshot or something
@rekram1-node Sorry, posted the wrong link. I just updated it
And do you have any custom tools? I haven't seen anyone else have this issue with litellm so I tend to think it is specific to your setup
does this happen on every request or is it only on compacts
I have the same issue. It works when I turn off all Tools in Continue configuration or remove "tool_use" from model config
Can either of you try this:
{
"$schema": "https://opencode.ai/config.json",
"provider": {
"myprovider": {
"npm": "@ai-sdk/openai-compatible",
"name": "myprovider",
"options": {
// "baseURL": "myproviderapi.example.com/v1"
// Go to webhook.site and paste your unique url here:
"baseURL": ""
// comment out to avoid leaking secrets (or delete to be safe)
// "apiKey": "{env:myproviderGPT_OPENAI_API_KEY}"
},
"models": {
"gpt-5": {
"name": "gpt-5"
},
....
}
}
}
}
If you set the URL to this proxy, open opencode, and send a message to any model under that provider, then go to the webhook site and grab the request body that is NOT for title generation (so the really big one with tools) and post it here. That should help me work out which tool / thing is causing the issue.
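If you'd rather not send request bodies to a third-party site, the same capture can be done locally. A minimal sketch (an assumption about your setup, not part of opencode): point `baseURL` at `http://localhost:8080/v1` while it runs; it only logs the `tools` array from each request and never forwards anything, so the actual completion will still fail after being captured.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def extract_tools(raw_body):
    """Pull the 'tools' array out of a captured request body, if any.
    Returns None for non-JSON bodies or requests without tools
    (e.g. the small title-generation request)."""
    try:
        payload = json.loads(raw_body)
    except (json.JSONDecodeError, UnicodeDecodeError):
        return None
    return payload.get("tools") if isinstance(payload, dict) else None

class CaptureHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        tools = extract_tools(self.rfile.read(length))
        if tools is not None:
            # This is the big request carrying tool definitions.
            print(json.dumps(tools, indent=2))
        self.send_response(200)
        self.end_headers()

def run(port=8080):
    # Call run() to start capturing, then Ctrl-C when done.
    HTTPServer(("localhost", port), CaptureHandler).serve_forever()
```

Checking the printed entries for a top-level `name` key would confirm or rule out the shape of the tools opencode is sending.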
I encounter the exact same issue with azure in litellm. This seems to happen only when tagging a file.
The webhook shows all functions have names.
However, I notice this seems to work with gpt-5.1 but not with gpt-5.1-codex.