Error: "Void: Response from model was empty."
I deployed `ollama run qwen2.5-coder:1.5b` locally. When I use chat/agent, I get the error "Void: Response from model was empty." However, the model responds normally when I talk to it in PowerShell. What's going on?
Thanks for the report! Can you tell us which OS you're using? I'm wondering if it's an endpoint issue.
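In the meantime, one way to rule out the endpoint is to hit Ollama's HTTP API directly from PowerShell. This is a minimal sketch assuming Void is pointed at Ollama's default endpoint (`http://localhost:11434`); the prompt is just an example:

```powershell
# Hit Ollama's HTTP API directly (default port 11434). A non-empty "response"
# field in the returned JSON means the model side is fine, and the empty-response
# error is more likely in the editor's endpoint configuration.
$body = @{
    model  = "qwen2.5-coder:1.5b"
    prompt = "Say hello"          # example prompt, anything works
    stream = $false
} | ConvertTo-Json

Invoke-RestMethod -Uri "http://localhost:11434/api/generate" `
    -Method Post -Body $body -ContentType "application/json"
```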
Win10 Professional 22H2 (19045.5371)
By the way, I installed the latest version, 1.2.1, today, and when using the DeepSeek API the same error appears: "Void: Response from model was empty."
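A quick way to rule out the key or the provider itself is to call the DeepSeek API outside the editor. This sketch assumes the standard OpenAI-compatible endpoint (`https://api.deepseek.com`), the `deepseek-chat` model name, and an API key stored in `$env:DEEPSEEK_API_KEY`:

```powershell
# Sanity-check the DeepSeek API with a direct request. If this returns a normal
# chat completion, the key and provider are fine and the issue is in the editor.
$headers = @{ Authorization = "Bearer $env:DEEPSEEK_API_KEY" }
$body = @{
    model    = "deepseek-chat"
    messages = @(@{ role = "user"; content = "Say hello" })
} | ConvertTo-Json -Depth 5

Invoke-RestMethod -Uri "https://api.deepseek.com/chat/completions" `
    -Method Post -Headers $headers -Body $body -ContentType "application/json"
```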
This should be fixed in 1.2.5, but I'll keep this open. @COOPER-DENG please confirm if it works!
Thank you for your reply. I installed 1.2.5.25107 today and it does respond now, but it takes several minutes per response, which feels too slow. (I tested with the DeepSeek API, an OpenRouter model, and local Ollama qwen2.5-coder:1.5b.)
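To tell whether the delay comes from the editor or from the model itself, one option is to time a raw, non-streaming request against the local Ollama endpoint. This is a sketch assuming the default port; the prompt is arbitrary:

```powershell
# Time a direct request to the local model. If this alone takes minutes,
# the bottleneck is the model/hardware rather than the editor.
Measure-Command {
    $body = @{
        model  = "qwen2.5-coder:1.5b"
        prompt = "Write a hello-world function in Python"  # example prompt
        stream = $false
    } | ConvertTo-Json
    Invoke-RestMethod -Uri "http://localhost:11434/api/generate" `
        -Method Post -Body $body -ContentType "application/json"
}
```

If the raw request is fast but the editor is still slow, that points to per-request overhead in the editor rather than model latency.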