Agent Panel: Invalid Model Name Error in LiteLLM Proxy
Summary
While using the LiteLLM proxy as an OpenAI-compatible endpoint for Zed's agent panel, requests fail with an invalid-model-name error. The error message reports the model gpt-4.1-mini, even though the configured model is codestral-latest.
Description
Steps to trigger the problem:
- Start the LiteLLM proxy server (a minimal example config is sketched after the Zed settings below)
- Open Zed's agent panel in write mode
- Give the agent a URL to read (https://zed.dev/blog/fastest-ai-code-editor)
- Ask the agent to summarize it

Snippets from Zed config:
```jsonc
// --- AI / Assistant Settings ---
// Language model configurations
"language_models": {
  // LiteLLM proxy as the OpenAI endpoint
  "openai": {
    "api_url": "http://127.0.0.1:4000/v1",
    "version": "1",
    "available_models": [
      {
        "name": "codestral-latest",
        "display_name": "Codestral-Latest (via LiteLLM)",
        "max_tokens": 128000
      }
    ]
  },
  // Google provider for Gemini models
  "google": {
    "available_models": [
      {
        "name": "gemini-2.0-flash",
        "display_name": "Gemini 2.0 Flash (Latest)",
        "max_tokens": 1000000
      }
    ]
  }
},
// Assistant behavior configurations
"agent": {
  "version": "2",
  // General "Ask the Assistant" & code edits via Codestral
  "default_model": {
    "provider": "openai",
    "model": "codestral-latest"
  },
  // Default: Inline suggestions via Codestral
  "inline_assistant_model": {
    "provider": "openai",
    "model": "codestral-latest"
  },
  // Default: Commit-message drafts via Codestral
  "commit_message_model": {
    "provider": "openai",
    "model": "codestral-latest"
  },
  // Alternative models for each task
  "default_alternatives": [
    {
      "provider": "google",
      "model": "gemini-2.0-flash"
    }
  ],
  "inline_alternatives": [
    {
      "provider": "google",
      "model": "gemini-2.0-flash"
    }
  ],
  "commit_message_alternatives": [
    {
      "provider": "google",
      "model": "gemini-2.0-flash"
    }
  ]
}
```
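For reference, the LiteLLM proxy has to expose the same model name that Zed requests. A minimal config.yaml sketch of what that could look like; the mistral/ routing prefix and the environment-variable key reference are illustrative assumptions, not taken from this report:

```yaml
# Hypothetical LiteLLM proxy config: exposes "codestral-latest" to clients
# and routes it to Mistral's Codestral endpoint.
model_list:
  - model_name: codestral-latest        # name Zed's "openai" provider requests
    litellm_params:
      model: mistral/codestral-latest   # assumed upstream route
      api_key: os.environ/MISTRAL_API_KEY
```

With a config like this, a request for any model name not listed under model_list (such as gpt-4.1-mini) would be rejected by the proxy with an invalid-model-name error, which would match the behavior described below.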
Actual Behavior: The LiteLLM proxy returns a 400 Bad Request error with a message indicating an invalid model name, yet the output shown in the agent window is correct.

Expected Behavior: The LiteLLM proxy should handle the request without errors when the configured model name is valid.
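One way to narrow this down is to query the proxy directly, bypassing Zed. A sketch using the openai Python client, assuming the proxy's default port and a placeholder key:

```python
# Minimal check that the proxy accepts the configured model name.
# "sk-1234" is a placeholder for whatever master key the proxy was started with.
from openai import OpenAI

client = OpenAI(base_url="http://127.0.0.1:4000/v1", api_key="sk-1234")

response = client.chat.completions.create(
    model="codestral-latest",  # must match a model_name known to the proxy
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)
```

If this succeeds while Zed still triggers the 400, the invalid model name originates in the request Zed builds, not in the proxy configuration.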
Zed Version and System Specs
Zed: v0.185.9
OS: Linux (Wayland), Fedora 42
Architecture: x86_64
I believe the api_url parameter is not respected anymore, so requests go directly to OpenAI's servers. (I am using another proxy and see the same kind of errors.)
It is respecting the api_url parameter: the LiteLLM proxy is receiving the requests and returning the full output (response). That is why the error message in the LiteLLM proxy log is so strange.
It seems to ignore api_url for me, too.
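One way to settle whether api_url is respected is to point it at a local listener and inspect what Zed actually sends. A throwaway sketch; the port 4001 and the whole listener are hypothetical debugging aids, not part of Zed or LiteLLM:

```python
# Temporarily set "api_url": "http://127.0.0.1:4001/v1", trigger the agent
# once, and read the request path and body (including the "model" field)
# from stdout. Requests are failed on purpose; this only inspects traffic.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class EchoHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        print("POST", self.path)
        print(json.dumps(json.loads(body), indent=2))
        self.send_response(500)  # reject so nothing is forwarded upstream
        self.end_headers()

HTTPServer(("127.0.0.1", 4001), EchoHandler).serve_forever()
```

If nothing arrives at the listener, api_url is being ignored; if a request shows up with an unexpected model field, the proxy's complaint about gpt-4.1-mini is explained.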
After following https://github.com/zed-industries/zed/issues/27326, it works with a LiteLLM proxy (e.g., forwarding to Claude 3.7 Sonnet).