Support memory in copilot chat.
Some thoughts on the implementation:
- Copilot memory can be per model and per project.
- We might consider implementing this as a new command.
Memory will be written to `memories` files and added to the system prompt.
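To make the idea concrete, here is a minimal sketch of how stored memories could be read from a per-project file and prepended to the system prompt. The file name `memories.md`, the function names, and the prompt wording are all hypothetical, not an actual implementation:

```typescript
import * as fs from "fs";
import * as path from "path";

const BASE_PROMPT = "You are Copilot, a helpful assistant.";

// Read the per-project memories file; a missing file just means no memories yet.
// (The "memories.md" file name is an assumption for this sketch.)
function loadMemories(projectDir: string): string[] {
  const file = path.join(projectDir, "memories.md");
  if (!fs.existsSync(file)) return [];
  return fs
    .readFileSync(file, "utf8")
    .split("\n")
    .map((line) => line.trim())
    .filter((line) => line.length > 0);
}

// Build the system prompt with the memories appended as a bulleted section.
function buildSystemPrompt(memories: string[]): string {
  if (memories.length === 0) return BASE_PROMPT;
  const section = memories.map((m) => `- ${m}`).join("\n");
  return `${BASE_PROMPT}\n\nThings to remember about the user:\n${section}`;
}
```

Keeping memories in a plain file also means the user can read, edit, or delete them directly, which fits the privacy point below.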
I'll add here the message that I sent to @logancyang as it contains some interesting ideas that are worth considering:
"Yes, I'm interested in the local memory feature. My main concern with AI tools is usually privacy – I don't like the idea of them storing my 'memory.' But if it's stored locally, I'm much more comfortable with it.
Here's a simple example of why this is useful: When I ask an AI to explain something "in simple terms," it often oversimplifies, assuming I have no prior knowledge. If the AI remembers that I study engineering, it can tailor its explanations to be more relevant and useful, like a colleague would. It wouldn't need to explain basic concepts like derivatives.
Regarding memory optimization, perhaps you could divide memories into "core" and "secondary" categories. Core memories would be consistently included at the start of each conversation. Secondary memories could be retrieved through a quick AI search using a lightweight model to identify the most relevant memories for the current question.
Additionally, the AI should be able to update its memory over time by searching for, replacing, or modifying older memories as needed."
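The core/secondary split suggested in the quoted message could be sketched like this. The cheap keyword-overlap score below is only a stand-in for the lightweight retrieval model the message proposes, and all type and function names are hypothetical:

```typescript
interface Memory {
  text: string;
  core: boolean; // core memories are always included
}

// Lowercase word set, dropping very short tokens.
function tokenize(s: string): Set<string> {
  return new Set(s.toLowerCase().split(/\W+/).filter((w) => w.length > 2));
}

// Score a memory by how many words it shares with the question.
function overlap(question: Set<string>, memory: string): number {
  let n = 0;
  for (const w of tokenize(memory)) if (question.has(w)) n++;
  return n;
}

// Core memories always go in; the top-k relevant secondary memories follow.
function selectMemories(memories: Memory[], question: string, k = 2): string[] {
  const q = tokenize(question);
  const core = memories.filter((m) => m.core).map((m) => m.text);
  const secondary = memories
    .filter((m) => !m.core)
    .map((m) => ({ text: m.text, score: overlap(q, m.text) }))
    .filter((m) => m.score > 0)
    .sort((a, b) => b.score - a.score)
    .slice(0, k)
    .map((m) => m.text);
  return [...core, ...secondary];
}
```

A real implementation would likely replace `overlap` with an embedding search or a small model call, but the selection flow would stay the same.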
I would also be super interested in this feature being added. In particular, I would like Copilot to remember the kind of formatting I prefer for my responses.