[BUG] Buffer Window Memory not working as expected
Describe the bug
The "Buffer Window Memory" node no longer applies the window memory size limit that is set on it.
The issue can be reproduced in 2.0.1, but not in 1.8.3.
To Reproduce
- Create a new basic flow with an LLM node, a Conversation Chain and a Buffer Window Memory node.
- Set the window memory size to 0.
- Send a message to the conversation, e.g. "I'm Daniel".
- Ask the LLM about the previous information, e.g. "What's my name?" (a scripted alternative is sketched after these steps).
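For convenience, here is a minimal scripted reproduction against the Flowise prediction REST API. The POST /api/v1/prediction/&lt;chatflowId&gt; endpoint and the question payload follow the Flowise docs; reusing chatId for session continuity is an assumption and may differ by version, so treat this as a sketch rather than an official recipe.

```typescript
// Hypothetical scripted reproduction via the Flowise prediction REST API.
// Assumptions: a local Flowise instance on port 3000, and that passing back
// the chatId from the first response keeps both messages in the same session.
const baseUrl = "http://localhost:3000";
const chatflowId = "<your-chatflow-id>"; // the flow built in the steps above

async function ask(question: string, chatId?: string) {
  const res = await fetch(`${baseUrl}/api/v1/prediction/${chatflowId}`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(chatId ? { question, chatId } : { question }),
  });
  return res.json() as Promise<{ text: string; chatId?: string }>;
}

const first = await ask("I'm Daniel");
const second = await ask("What's my name?", first.chatId);
// With window size 0 the model should not know the name; in 2.0.1 it does.
console.log(second.text);
```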
Expected behavior
The LLM should respond "I don't know", because the window memory size is 0 and therefore no messages should be remembered.
Instead, the window memory limit is never applied, so the entire history appears to be stored regardless of the value set. This can also be reproduced with other values, such as 1.
Setup
- Installation: docker, pnpm start
- Flowise Version: 2.0.1
- OS: macOS, linux
- Browser: Chrome
I was able to debug this in the code and confirm that we are indeed retrieving an unlimited number of messages in Flowise 2.0.1.
While reviewing the code I also noticed something else, unrelated to the original issue: we are using take: this.k + 1, which means we take one message more than the window size the user set.
This doesn't match the parameter's description: "Uses a window of size k to surface the last k back-and-forth to use as memory." If I set it to 0 in Flowise 1.8.3, 1 message is returned; if I set it to 10, 11 messages are returned.
Should that logic be take: this.k * 2 instead?
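For illustration, here is a minimal sketch of the arithmetic (the Message type and the chronological ordering are hypothetical, not Flowise's actual storage schema): one back-and-forth is a human message plus an AI message, so a window of k exchanges corresponds to the last k * 2 stored messages, not k + 1.

```typescript
// Minimal sketch: why a window of k "back-and-forths" means k * 2 messages.
// The Message type and ordering are assumptions for illustration only.
type Message = { role: "human" | "ai"; text: string };

function lastKExchanges(history: Message[], k: number): Message[] {
  if (k <= 0) return [];        // window of 0 => remember nothing
  return history.slice(-k * 2); // k exchanges = k human + k ai messages
}

const history: Message[] = [
  { role: "human", text: "I'm Daniel" },
  { role: "ai", text: "Nice to meet you, Daniel." },
  { role: "human", text: "What's my name?" },
  { role: "ai", text: "Your name is Daniel." },
];

console.log(lastKExchanges(history, 1)); // last exchange only (2 messages)
console.log(lastKExchanges(history, 0)); // [] — nothing remembered
```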
Yes, I think you are right. Looking at the LangChain source code, it also uses messages.slice(-this.k * 2).
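For comparison, a quick check against LangChain's own BufferWindowMemory (classic langchain JS package; the import path and defaults are as documented there, but treat this as an illustrative sketch rather than a verified snippet):

```typescript
import { BufferWindowMemory } from "langchain/memory";

// k = 1 should keep exactly one back-and-forth (2 messages) in memory.
const memory = new BufferWindowMemory({ k: 1, returnMessages: true });

await memory.saveContext({ input: "I'm Daniel" }, { output: "Nice to meet you, Daniel." });
await memory.saveContext({ input: "What's my name?" }, { output: "Your name is Daniel." });

const { history } = await memory.loadMemoryVariables({});
console.log(history.length); // 2 — i.e. messages.slice(-this.k * 2), not k + 1
```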
Fixed in this PR https://github.com/FlowiseAI/Flowise/pull/3242