llm.nvim
feature request: show the error message when the response couldn't be serialized
Hi, when I set an invalid model for my endpoint, the endpoint responds with a message like the one below, but it can be any message:
{"object":"error","message":"The model `qwen2.5-coder-1.5B` does not exist.","type":"NotFoundError","param":null,"code":404}
So I know that I made an error and can fix it easily.
However, in Neovim I only see a very unhelpful, unreadable message that doesn't immediately tell me anything:
[LLM] serde json error: data did not match any variant of untagged enum OpenAIAPIResponse
The only way I can get at the actual message is to capture the traffic with tcpdump -A (which requires root).
Please kindly add the response body to the notification when displaying it in Neovim.
The message can look like this:
[LLM] serde json error: data did not match any variant of untagged enum OpenAIAPIResponse: {"object":"error","message":"The model `qwen2.5-coder-1.5B` does not exist.","type":"NotFoundError","param":null,"code":404}
An even better notification could describe the state the program is in and suggest a solution, for example:
Connection to the URL was established, but the message received was unreadable. The most common cause is that the URL is invalid or the parameters sent to it were rejected by the endpoint. [LLM] serde json error: data did not match any variant of untagged enum OpenAIAPIResponse: {"object":"error","message":"The model `qwen2.5-coder-1.5B` does not exist.","type":"NotFoundError","param":null,"code":404}
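For illustration, here is a minimal sketch of how the raw body could be carried into the error, assuming a Rust backend using reqwest and serde_json (the OpenAIAPIResponse enum below is a hypothetical stand-in for the plugin's real type). The idea is to read the response as text first, so the original payload is still available when deserialization fails:

```rust
use serde::Deserialize;

// Hypothetical stand-in for the plugin's real untagged response enum.
#[derive(Deserialize)]
#[serde(untagged)]
enum OpenAIAPIResponse {
    Completion { choices: Vec<serde_json::Value> },
}

// Read the body as text first, then deserialize it, so the raw
// payload can be appended to the serde error as requested above.
async fn parse_response(res: reqwest::Response) -> Result<OpenAIAPIResponse, String> {
    let raw = res.text().await.map_err(|e| e.to_string())?;
    serde_json::from_str(&raw).map_err(|e| format!("serde json error: {e}: {raw}"))
}
```

With something like this in place, the notification would contain both the serde error and the server's own error JSON, which points directly at the misconfigured model name.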
Thank you, the plugin is amazing and I am using it daily. You are great, thanks.