Non string message content not supported
I updated to v3.0.2 and I'm getting this response from the agent in the Obsidian chat on my MacBook Pro M4.
Console Message: Non string message content not supported
ie @ plugin:copilot:119
I get it for all Groq models, maybe it's a regression.
From which version did you upgrade Copilot? It seems like a settings or an upgrade problem. Could you please try creating a new vault to check the difference?
I get it for all Groq models, maybe it's a regression.
Ah ok. So no problems with other models?
I have the same issue; however, I just installed the plugin, so I don't think it's an upgrade problem.
Please provide the full chat UI and console in one screenshot so that we can see the entire picture.
I'm getting the same issue only with groq models, output is always: "Non string message content not supported".
In my Obsidian setup I also encountered this problem, with Groq's API key. I'll try to provide some feedback, even though I'm not an expert:
- I tried chatting or using a template, and I got no results.
- All open notes and the chat interface display an orange attention rectangle, indicating a "Note has not been indexed" error.
- When sending messages in either "chat (free)" or "vault QA (free)" modes, the chatbot consistently returns the error "Non string message content not supported." The "vault QA (free)" mode exhibits a slight delay with a loading indicator before this error appears, whereas "chat (free)" shows it instantaneously.
- I tried using the chat in a vault I had just created, and it didn't work. It produced all the same errors I've reported.
- When I refresh the plugin or force the vault index, it returns this message: "Lexical search builds indexes on demand. No manual indexing required."
- When I close and reopen the Obsidian app, it returns this message: "Failed to initialize vector store. Please make sure you have a valid API key for your embedding model and restart the plugin."
- I'm using version 3.0.2 of the plugin and the model "llama-3.3-70b-versatile (Groq)."
I encountered the same problem when using Groq in v3.0.2. I then tested v3.0.1 and v3.0.0, and both had this problem, but it worked well in v2.9.5.
Just installed it and got the same error message.
@logancyang I can reproduce. The error occurs before Copilot sends any data.
@ichts I can't repro.
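For context, an error like "Non string message content not supported" usually means a chat message's `content` field arrived as something other than a plain string, e.g. the array-of-parts shape used for multimodal messages. A minimal sketch of such a guard, assuming nothing about Copilot's actual code (all names here are hypothetical):

```typescript
// Hypothetical sketch, NOT Copilot's actual implementation: the error string
// in this thread is consistent with a guard that only accepts string content.
type ContentPart = { type: string; text?: string };
type MessageContent = string | ContentPart[];

// Models a guard that rejects the array-of-parts content shape.
function ensureStringContent(content: MessageContent): string {
  if (typeof content === "string") {
    return content;
  }
  throw new Error("Non string message content not supported");
}
```

If the Groq provider path produced content as an array of parts while other providers returned plain strings, a guard like this would fail only for Groq models, which would match the reports above.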