
Agent Panel: Invalid Model Name Error in LiteLLM Proxy

Open bengtfrost opened this issue 7 months ago • 4 comments

Summary

While using the LiteLLM Proxy, an error occurs because an invalid model name is passed: the error message indicates that the model gpt-4.1-mini is being requested, even though the Zed configuration specifies codestral-latest.

Description

Steps to trigger the problem:

  1. Start the litellm proxy server
  2. Put the Zed agent in write mode
  3. Add a URL to the Zed agent (https://zed.dev/blog/fastest-ai-code-editor)
  4. Ask the agent to summarize it

Snippets from Zed config:
 // --- AI / Assistant Settings ---

  // Language model configurations
  "language_models": {
    // LiteLLM proxy as the OpenAI endpoint
    "openai": {
      "api_url": "http://127.0.0.1:4000/v1",
      "version": "1",
      "available_models": [
        {
          "name": "codestral-latest",
          "display_name": "Codestral-Latest (via LiteLLM)",
          "max_tokens": 128000
        }
      ]
    },
    // Google provider for Gemini models
    "google": {
      "available_models": [
        {
          "name": "gemini-2.0-flash",
          "display_name": "Gemini 2.0 Flash (Latest)",
          "max_tokens": 1000000
        }
      ]
    }
  },

  // Assistant behavior configurations
  "agent": {
    "version": "2",
    // General "Ask the Assistant" & code edits via Codestral
    "default_model": {
      "provider": "openai",
      "model": "codestral-latest"
    },
    // Default: Inline suggestions via Codestral
    "inline_assistant_model": {
      "provider": "openai",
      "model": "codestral-latest"
    },
    // Default: Commit-message drafts via Codestral
    "commit_message_model": {
      "provider": "openai",
      "model": "codestral-latest"
    },
    // Alternative models for each task
    "default_alternatives": [
      {
        "provider": "google",
        "model": "gemini-2.0-flash"
      }
    ],
    "inline_alternatives": [
      {
        "provider": "google",
        "model": "gemini-2.0-flash"
      }
    ],
    "commit_message_alternatives": [
      {
        "provider": "google",
        "model": "gemini-2.0-flash"
      }
    ]
  }
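
For the openai provider entry above to resolve, the proxy side must expose a model literally named codestral-latest. As a quick check of which names the proxy advertises, here is a minimal Python sketch; it assumes the proxy is running at the api_url configured above and supports the standard OpenAI-compatible /v1/models endpoint, and the api_key value is a placeholder for whatever key the proxy expects:

  # Minimal sketch: list the model names the LiteLLM proxy exposes.
  # Assumes the proxy from the config above; api_key is a placeholder.
  from openai import OpenAI

  client = OpenAI(base_url="http://127.0.0.1:4000/v1", api_key="sk-placeholder")

  for model in client.models.list():
      print(model.id)  # "codestral-latest" should appear here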

Actual Behavior: The LiteLLM Proxy returns a 400 Bad Request error with a message indicating an invalid model name, yet the output shown in the agent window is correct.

Expected Behavior: The LiteLLM Proxy should handle the request without errors when the model name is valid.
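
To check whether the proxy itself rejects the configured model name independently of Zed, a chat completion request can be sent directly to the LiteLLM endpoint. This is a minimal sketch in the same spirit as the one above, with the same placeholder api_key assumption:

  # Minimal sketch: send the same model name Zed is configured with straight
  # to the LiteLLM proxy, bypassing the editor entirely.
  from openai import OpenAI

  client = OpenAI(base_url="http://127.0.0.1:4000/v1", api_key="sk-placeholder")

  response = client.chat.completions.create(
      model="codestral-latest",
      messages=[{"role": "user", "content": "Say hello."}],
  )
  print(response.choices[0].message.content)

If this direct request succeeds but Zed still triggers the gpt-4.1-mini error, the stray model name presumably comes from the editor side of the request rather than from the proxy configuration.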

Zed Version and System Specs

Zed: v0.185.9
OS: Linux (Wayland), Fedora 42
Architecture: x86_64

bengtfrost · May 07 '25 10:05

I believe the api_url parameter is not respected anymore, so requests are sent directly to the OpenAI servers. (I'm using another proxy and see the same kind of errors.)

colinux · May 07 '25 15:05

It is respecting the api_url parameter in my case: the LiteLLM proxy is working and the full output (response) comes through. That is why the error message in the LiteLLM proxy log is so strange.

bengtfrost · May 07 '25 17:05

It seems to ignore api_url for me, too.

phirsch · May 09 '25 03:05

After following https://github.com/zed-industries/zed/issues/27326, it works with a LiteLLM proxy (e.g. forwarding to Claude 3.7 Sonnet).

phirsch · May 11 '25 08:05