
Agent Panel: Not having responses from LM Studio

ngocphamm opened this issue 8 months ago

Summary

When connected to LM Studio, I can't get the chat to work: I only receive the first token/word of the response, or sometimes nothing at all.

Description

Steps to trigger the problem:

  1. Connect to LM Studio, using default settings
  2. Select a model, say mlx-community/llama-3.2-3b-instruct
  3. Chat with it
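
To isolate whether the truncation happens in Zed or in LM Studio, the same streaming request can be replayed outside the editor. A minimal sketch that builds an equivalent request, assuming LM Studio's server is running on its default port 1234 (the host and prompt are placeholders; `max_tokens: -1` is LM Studio's convention for "no limit"):

```python
import json
import urllib.request

def build_request(model, prompt, host="http://localhost:1234"):
    """Build a streaming chat-completion request similar to what Zed sends."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": True,
        "max_tokens": -1,  # assumption: -1 means "no limit" in LM Studio
        "temperature": 0,
    }
    return urllib.request.Request(
        f"{host}/api/v0/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

req = build_request("llama-3.2-3b-instruct", "Hello.")
print(req.full_url)  # → http://localhost:1234/api/v0/chat/completions
```

Passing the request to `urllib.request.urlopen` and reading the response line by line would then show whether the full stream arrives outside Zed.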

Actual Behavior:

No response is shown, or only the first word/token.

Expected Behavior:

Responses from LM Studio models are received normally.

Additional information

How the Zed chat window looks:

[screenshot]

This is from the debug log inside LM Studio:

2025-04-30 07:25:16 [DEBUG] 
Received request: POST to /api/v0/chat/completions with body  {
  "model": "llama-3.2-3b-instruct",
  "messages": [
    {
      "role": "user",
      "content": "\n\n\nHello.\n"
    },
    {
      "role": "user",
      "content": "Generate a concise 3-7 word title for this convers... <Truncated in logs> ...like `Here's a concise suggestion:...` or `Title:`"
    }
  ],
  "stream": true,
  "max_tokens": -1,
  "stop": [],
  "temperature": 0,
  "tools": []
}
2025-04-30 07:25:16  [INFO] 
[LM STUDIO SERVER] Running chat completion on conversation with 2 messages.
2025-04-30 07:25:16  [INFO] 
[LM STUDIO SERVER] Streaming response...
2025-04-30 07:25:16 [DEBUG] 
[mlx-engine] Stop string '<|eom_id|>' not found in final text segment, even though a full stop was detected. Not trimming final segment.[CacheWrapper][INFO] Trimmed 24 tokens from the prompt cache
2025-04-30 07:25:16  [INFO] 
[LM STUDIO SERVER] First token generated. Continuing to stream response..
2025-04-30 07:25:16  [INFO] 
Finished streaming response
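
The log suggests LM Studio generated and streamed the full response ("First token generated. Continuing to stream response.." followed by "Finished streaming response"), so showing only the first token is consistent with the client dropping the SSE stream after the first chunk. A minimal sketch of how OpenAI-style streamed chunks are assembled on the client side, assuming the standard `data:` SSE framing (the sample content below is hypothetical, not from the log):

```python
import json

def assemble_sse(lines):
    """Concatenate content deltas from OpenAI-style 'data:' SSE lines."""
    parts = []
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip comments/keep-alives
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break  # end-of-stream sentinel
        chunk = json.loads(payload)
        delta = chunk["choices"][0].get("delta", {})
        parts.append(delta.get("content", ""))
    return "".join(parts)

# Hypothetical stream resembling what the server emits:
stream = [
    'data: {"choices":[{"delta":{"content":"Hello"}}]}',
    'data: {"choices":[{"delta":{"content":" there"}}]}',
    'data: {"choices":[{"delta":{"content":"!"}}]}',
    "data: [DONE]",
]
print(assemble_sse(stream))  # → Hello there!
```

A client that stops reading after the first chunk would display only "Hello", which matches the symptom described above.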

Zed Version and System Specs

Zed: v0.183.12 (Zed)
OS: macOS 15.4.1
Memory: 32 GiB
Architecture: aarch64

ngocphamm · Apr 30 '25 11:04