Agent Panel: Gemini API error through LiteLLM
Summary
When using the Gemini 2.5 Pro API through a LiteLLM proxy, I receive the error below.
Error interacting with language model: data did not match any variant of untagged enum ResponseStreamResult
Description
Steps to trigger the problem:
- Install Zed 0.185.11
- Set up a litellm instance with models from OpenAI, Anthropic, and Gemini. Litellm version: ghcr.io/berriai/litellm:main-v1.68.1-nightly
- Configure Zed settings to use litellm (see below).
Settings
"openai": {
"version": "1",
// "api_url": "https://api.openai.com/v1"
"api_url": "https://litellm.internal.com",
"available_models": [
{
"name": "claude/claude-3-7-sonnet-20250219",
"display_name": "litellm/claude-3-7-sonnet-20250219",
"max_tokens": 200000
},
{
"name": "gemini/gemini-2.5-pro",
"display_name": "litellm/gemini-2.5-pro",
"max_tokens": 1000000
},
{
"name": "openai/o4-mini",
"display_name": "litellm/o4-mini",
"max_tokens": 200000
}
]
},
Actual Behavior:
Zed Editor error: Error interacting with language model: data did not match any variant of untagged enum ResponseStreamResult
LiteLLM error:

```
litellm.BadRequestError: VertexAIException BadRequestError -
{
  "error": {
    "code": 400,
    "message": "Invalid JSON payload received. Unknown name \"type\" at 'tools[0].function_declarations[1].parameters.properties[0].value': Proto field is not repeating, cannot start list.\nInvalid JSON payload received. Unknown name \"type\" at 'tools[0].function_declarations[3].parameters.properties[0].value': Proto field is not repeating, cannot start list.\nInvalid JSON payload received. Unknown name \"type\" at 'tools[0].function_declarations[3].parameters.properties[2].value': Proto field is not repeating, cannot start list.\nInvalid JSON payload received. Unknown name \"type\" at 'tools[0].function_declarations[8].parameters.properties[1].value': Proto field is not repeating, cannot start list.",
    "status": "INVALID_ARGUMENT",
    "details": [
      {
        "@type": "type.googleapis.com/google.rpc.BadRequest",
        "fieldViolations": [
          {
            "field": "tools[0].function_declarations[1].parameters.properties[0].value",
            "description": "Invalid JSON payload received. Unknown name \"type\" at 'tools[0].function_declarations[1].parameters.properties[0].value': Proto field is not repeating, cannot start list."
          },
          {
            "field": "tools[0].function_declarations[3].parameters.properties[0].value",
            "description": "Invalid JSON payload received. Unknown name \"type\" at 'tools[0].function_declarations[3].parameters.properties[0].value': Proto field is not repeating, cannot start list."
          },
          {
            "field": "tools[0].function_declarations[3].parameters.properties[2].value",
            "description": "Invalid JSON payload received. Unknown name \"type\" at 'tools[0].function_declarations[3].parameters.properties[2].value': Proto field is not repeating, cannot start list."
          },
          {
            "field": "tools[0].function_declarations[8].parameters.properties[1].value",
            "description": "Invalid JSON payload received. Unknown name \"type\" at 'tools[0].function_declarations[8].parameters.properties[1].value': Proto field is not repeating, cannot start list."
          }
        ]
      }
    ]
  }
}
```
Expected Behavior: The API works for Gemini models just as it does for the OpenAI and Anthropic providers.
Zed Version and System Specs
Zed: v0.185.11 (Zed) OS: macOS 15.4.1 Memory: 24 GiB Architecture: aarch64
Same problem here.
Zed uses schemars to convert tool inputs to JSON schemas in requests. By default, schemars converts Option parameters (like those in DiagnosticsToolInput) to a "type": ["string", "null"] array in the schema.
Not sure if this is a VertexAI validation issue, since using other models like o3 and o3-mini through the LiteLLM proxy works fine.
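To illustrate the schemars behavior described above, here is a minimal, self-contained sketch (the "path" field is a hypothetical stand-in for the optional fields in DiagnosticsToolInput):

```rust
use schemars::{schema_for, JsonSchema};

#[derive(JsonSchema)]
struct DiagnosticsToolInput {
    /// Hypothetical optional field; schemars' default output for
    /// Option<String> is "type": ["string", "null"].
    path: Option<String>,
}

fn main() {
    let schema = schema_for!(DiagnosticsToolInput);
    println!("{}", serde_json::to_string_pretty(&schema).unwrap());
    // The "path" property is emitted as {"type": ["string", "null"]}.
    // Gemini's proto-based validator rejects a list under "type",
    // which matches the "Proto field is not repeating, cannot start
    // list" errors above.
}
```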
Happening with all openrouter models as well for me.
I don't think it's a Zed issue. It's mostly an upstream issue: the proxies aren't cleaning up the OpenAI tool schema for Gemini models, even though Gemini doesn't support all tool schema fields. You should raise this with the respective upstream providers. The schema Zed emits is tightly bound to the OpenAI spec, so this can't be fixed from Zed and shouldn't be.
@imumesh18 I see, thanks for letting us know. Can you be more specific about what the mismatch is so we can open good bug reports at the provider repos?
I got the problem as well when using the OpenAI API-compatible provider with:
"language_models": {
"openai": {
"api_url": "https://api.chataiapi.com/v1",
"version": "1",
"available_models": [
{
"name": "gemini-2.5-pro-preview-05-06",
"display_name": "Gemini 2.5 Pro 05-06",
"max_tokens": 200000
}
]
}
},
The error it gives me:
```
Error interacting with language model
Failed to connect to OpenAI API: Invalid JSON payload received. Unknown name "type" at 'tools[0].function_declarations[7].parameters.properties[0].value': Proto field is not repeating, cannot start list.
Invalid JSON payload received. Unknown name "type" at 'tools[0].function_declarations[9].parameters.properties[0].value': Proto field is not repeating, cannot start list.
Invalid JSON payload received. Unknown name "type" at 'tools[0].function_declarations[9].parameters.properties[2].value': Proto field is not repeating, cannot start list.
Invalid JSON payload received. Unknown name "type" at 'tools[0].function_declarations[11].parameters.properties[1].value': Proto field is not repeating, cannot start list.
Invalid JSON payload received. Unknown name "definitions" at 'tools[0].function_declarations[13].parameters': Cannot find field.
Invalid JSON payload received. Unknown name "$ref" at 'tools[0].function_declarations[13].parameters.properties[0].value.all_of[0]': Cannot find field. (request id: 20250509140236565676537UIfuFZJX) (request id: 20250509140236233948494XJ66n4Wb)
```
Zed version: 0.185.13 OS: macOS 15.3.1
Following; getting similar errors with litellm/zed. Also unable to call tools using the models that do work through litellm/zed.
The same errors for me when trying to use the llama.cpp server API, openrouter, or litellm for tool use. If I drop the additional parameters "tools" and "tool_choice" by letting the litellm proxy filter them out, the errors disappear. However, then you cannot use MCP tools.
For litellm config.yaml:

```yaml
litellm_params:
  model: "openrouter/deepseek/deepseek-chat-v3-0324:free"
  api_key: "os.environ/OPENROUTER_API_KEY"
  additional_drop_params: ["tools", "tool_choice"]
```
For Zed's settings.json:
"openai": {
"version": "1",
"api_url": "http://localhost:4000/v1",
"available_models": [
{
"name": "openrouter-deepseek-v3",
"display_name": "deepseek-v3",
"max_tokens": 128000,
"max_output_tokens": 65000,
"drop_params": true,
"response_format": { "type": "json_object" }
}]
same here, using the litellm vertex_ai provider with Gemini:

```json
{
  "name": "gemini-2.5-pro-preview-05-06",
  "display_name": "gemini-2.5-pro-preview-05-06",
  "max_tokens": 10000000
}
```
Same here for the Anthropic API (using langdock.com):
```json
{
  "version": "1",
  "api_url": "https://api.langdock.com/anthropic/eu/",
  "available_models": [
    {
      "name": "claude-3-7-sonnet-20250219",
      "display_name": "(Langdock) Claude 3.7-Sonnet",
      "max_tokens": 128000,
      "max_output_tokens": 64000
    }
  ]
}
```
receiving:

```
Failed to connect to API: 400 Bad Request {"message":[{"code":"invalid_union","unionErrors":[{"issues":[{"received":"none","code":"invalid_literal","expected":"auto","path":["tool_choice","type"],"message":"Invalid literal value, expected \"auto\""}],"name":"ZodError"},{"issues":[{"received":"none","code":"invalid_literal","expected":"any","path":["tool_choice","type"],"message":"Invalid literal value, expected \"any\""}],"name":"ZodError"},{"issues":[{"received":"none","code":"invalid_literal","expected":"tool","path":["tool_choice","type"],"message":"Invalid literal value, expected \"tool\""},{"code":"invalid_type","expected":"string","received":"undefined","path":["tool_choice","name"],"message":"Required"}],"name":"ZodError"}],"path":["tool_choice"],"message":"Invalid input"}]}
```
which indicates that for some reason the required tool_choice parameter is set to none instead of auto or any.
Version: Zed 0.186.11
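For reference, a minimal sketch (assuming serde; these are not Zed's actual types) of the three tool_choice shapes the Zod error says this API accepts:

```rust
use serde::Serialize;

/// Per the Zod error above, the endpoint accepts {"type":"auto"},
/// {"type":"any"}, or {"type":"tool","name":"..."}. A request carrying
/// {"type":"none"} matches none of the union variants and is rejected.
#[derive(Serialize)]
#[serde(tag = "type", rename_all = "lowercase")]
enum ToolChoice {
    Auto,
    Any,
    Tool { name: String },
}
```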
Is there any tool to tell us what attributes need to be added or removed from a response to make it adhere to the strict schema Zed needs? Since people are going to want to run Zed against many long-tail models, I think it would be reasonable to allow some flexibility in LLM responses and schemas.
Google does not support full JSON Schema for tool parameters (yet). When using the Google AI provider we try to make the schema compatible with what Google supports: https://github.com/zed-industries/zed/blob/8ec69afab17057eb94719b33e1a5c9ca2dac4261/crates/assistant_tool/src/tool_schema.rs#L9
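For readers following along, here is a rough sketch of the kind of transformation involved, using serde_json; the function name is hypothetical, and the linked tool_schema.rs is the actual implementation:

```rust
use serde_json::{json, Value};

/// Rough sketch: rewrite a JSON schema in place so Gemini's
/// proto-based validator accepts it. Not Zed's actual code.
fn adapt_schema_for_gemini(schema: &mut Value) {
    match schema {
        Value::Object(obj) => {
            // Gemini rejects these JSON Schema keywords outright
            // ("Cannot find field" in the errors above). A real
            // implementation would inline $ref/allOf rather than
            // simply dropping them.
            for key in ["$schema", "$ref", "definitions", "allOf", "oneOf"] {
                obj.remove(key);
            }
            // Collapse the ["T", "null"] type array that schemars
            // emits for Option<T> (the "Proto field is not repeating,
            // cannot start list" errors) into one type plus nullable.
            if let Some(Value::Array(types)) = obj.get("type").cloned() {
                let non_null: Vec<Value> = types
                    .iter()
                    .filter(|t| t.as_str() != Some("null"))
                    .cloned()
                    .collect();
                if non_null.len() < types.len() {
                    obj.insert("nullable".into(), json!(true));
                }
                if let Some(first) = non_null.into_iter().next() {
                    obj.insert("type".into(), first);
                }
            }
            // Recurse into nested schemas (properties, items, ...).
            for (_, child) in obj.iter_mut() {
                adapt_schema_for_gemini(child);
            }
        }
        Value::Array(items) => {
            for item in items.iter_mut() {
                adapt_schema_for_gemini(item);
            }
        }
        _ => {}
    }
}
```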
Currently we only apply this transformation when using the Google AI provider, but we might need to change this so that it is always applied when using Gemini models.
Not sure how we would detect that in a robust way. A short-term fix could be to check whether the model name contains "gemini".
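That short-term check could be as simple as the following (hypothetical, not Zed's code):

```rust
/// Hypothetical name-based heuristic: apply the Gemini schema
/// transformation whenever the configured model looks like a Gemini
/// model, even when it is reached through an OpenAI-compatible proxy.
fn is_gemini_model(model_name: &str) -> bool {
    model_name.to_ascii_lowercase().contains("gemini")
}
```

It would match both "gemini/gemini-2.5-pro" and "gemini-2.5-pro-preview-05-06" from the configs above, though it obviously cannot detect Gemini models served under renamed aliases.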
For what it's worth, I believe this issue only occurs with the OpenAI-compatible API and custom models. We could probably introduce a new field on the model, such as vendor, which defines the underlying model's actual provider, and base the logic on that. That said, I still feel litellm should handle removing unsupported params from the tool schema, like the rest of the OpenAI-style multi-vendor providers do. There is already a PR that should hopefully fix this: https://github.com/BerriAI/litellm/pull/11539
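As a sketch of that idea (the vendor field is purely hypothetical, not an existing Zed setting):

```rust
use serde::Deserialize;

/// Hypothetical extension of a custom model entry: `vendor` names the
/// underlying provider so the client can pick the right tool-schema
/// dialect regardless of which proxy serves the model.
#[derive(Deserialize)]
struct AvailableModel {
    name: String,
    display_name: Option<String>,
    max_tokens: u64,
    /// e.g. "google", "anthropic", "openai" (assumed values).
    vendor: Option<String>,
}
```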
it seems like this is only the now tool and the edit_file tool for me. I don't care so much about "now", but edit_file is obviously kind of important for a file-editing agent 😆. But if I turn just those tools off, I don't get errors, at least not outright. With those tools enabled I can't send any text to litellm at all.
edit: this is with an Anthropic model, through an OpenAI-compatible config to litellm, not Gemini. But this didn't used to happen, and it's only those two tools, so I wonder if it is a Zed problem..
Any progress on this? This is pretty much preventing me from using the Agent panel in the editor for any write tasks.