jupyter-ai
Make the default chat window memory size configurable
Problem
The chat conversation memory window of 2 is a bit small, especially since people are used to long conversation memories from their experience with ChatGPT.
Proposed Solution
- Make the `k` param in the `ConversationBufferWindowMemory` of the `DefaultChatHandler` configurable, preferably as a setting in the UI.
- A short-term fix of exposing it as a global constant like `MEMORY_K = 2` that I can configure myself would be fine in the meantime.
- This could also be a provider-level configuration with a provider-level default, if differing context window sizes are an issue.
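To illustrate the idea behind the request, here is a minimal self-contained sketch of a windowed chat memory with a configurable `k`. It does not use jupyter-ai or LangChain; the `MEMORY_K` constant name comes from the issue text, and the `WindowMemory` class is purely hypothetical, standing in for LangChain's `ConversationBufferWindowMemory`:

```python
from collections import deque

# Hypothetical module-level constant, mirroring the MEMORY_K short-term
# fix proposed in the issue; not an actual jupyter-ai setting.
MEMORY_K = 2


class WindowMemory:
    """Keep only the last k (human, ai) message exchanges."""

    def __init__(self, k: int = MEMORY_K):
        # deque with maxlen silently drops the oldest exchange once full,
        # which is exactly the "window of k" behavior being discussed
        self.exchanges = deque(maxlen=k)

    def add(self, human: str, ai: str) -> None:
        self.exchanges.append((human, ai))

    def buffer(self) -> list:
        return list(self.exchanges)


memory = WindowMemory(k=2)
for i in range(5):
    memory.add(f"question {i}", f"answer {i}")

# Only the 2 most recent exchanges remain in the prompt context
print(memory.buffer())
```

Making `k` configurable is then just a matter of reading it from a setting instead of a hard-coded constant.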
Additional context
@michaelchia Yes, agreed. We set that limit earlier in 0.x because a lot of users were running into token limit issues. We're working on two different tasks in parallel:
- A more robust configuration system
- A better strategy for tracking token usage and limiting prompt size
I will make your issue a priority for when I implement the configuration system.
Related to #218, a major config refactor.
This issue was fixed long ago, but we forgot to close it. Users can now use `--AiExtension.default_max_chat_history=...` to set the number of historical messages passed to the chat model. 🎉
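For reference, the flag above is a standard traitlets-style server option; the value below is illustrative, not a recommended default:

```shell
# Pass the last 4 historical messages to the chat model
# (the flag name is from the comment above; 4 is an example value)
jupyter lab --AiExtension.default_max_chat_history=4
```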