How to prevent memory from being shared between two LLM nodes
Self Checks
- [x] This is only for bug report, if you would like to ask a question, please head to Discussions.
- [x] I have searched for existing issues, including closed ones.
- [x] I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
- [x] [FOR CHINESE USERS] Please submit issues in English, otherwise they will be closed. Thank you! :)
- [x] Please do not modify this template :) and fill in all the required fields.
Dify version
official web
Cloud or Self Hosted
Cloud
Steps to reproduce
In a workflow, I have enabled the memory function on both LLM nodes, but I don't want the conversation memory of the first LLM node to be passed to the second LLM node.
✔️ Expected Behavior
The memory of the first LLM node should not be passed to the second LLM node; each node should keep its own conversation context.
❌ Actual Behavior
No response
To prevent the first LLM node's memory from being passed to the second LLM node when using the Gemini model in chat mode, disable the built-in memory function on both LLM nodes and maintain conversation context with conversation variables instead [1].
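Dify's node memory is configured in the UI rather than in code, but the idea behind the suggestion above can be sketched in plain Python: each LLM node keeps its own message history, so one node's context never leaks into the other. The `LLMNode` class, the `ask` method, and the `fake_reply` parameter below are all hypothetical stand-ins, not Dify internals.

```python
class LLMNode:
    """Hypothetical LLM node that keeps its own, isolated message history."""

    def __init__(self, name):
        self.name = name
        self.history = []  # per-node memory; never shared with other nodes

    def ask(self, user_message, fake_reply):
        # Build the prompt from THIS node's history only.
        prompt = self.history + [{"role": "user", "content": user_message}]
        # fake_reply stands in for a real model call.
        self.history = prompt + [{"role": "assistant", "content": fake_reply}]
        return fake_reply

node_a = LLMNode("first")
node_b = LLMNode("second")

node_a.ask("My name is Alice.", "Noted: your name is Alice.")
node_b.ask("What is my name?", "I don't know your name.")

# node_b's history contains only its own two turns, so nothing the
# user told node_a ("Alice") can appear in node_b's prompt.
```

With Dify's shared memory enabled on both nodes, both would effectively read from one combined history; keeping separate histories (or, in Dify, disabling memory and writing only what you want into conversation variables) is what isolates the two nodes.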
See the first item in the checklist