
Make the default chat window memory size configurable


Problem

The chat conversation memory window of 2 is quite small, especially since users are accustomed to much longer conversation histories from their experience with ChatGPT.

Proposed Solution

  • Make the k parameter of the ConversationBufferWindowMemory in the DefaultChatHandler configurable.
  • Preferably as a setting in the UI.
  • As a short-term fix, exposing it as a global constant (e.g. MEMORY_K = 2) that I can override would be fine in the meantime (see the sketch after this list).
  • This could also be a provider-level configuration with a provider-level default, if differing context window sizes are a concern.
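A minimal sketch of the short-term fix described above, assuming the default chat handler builds its memory with LangChain's ConversationBufferWindowMemory. The MEMORY_K constant, the JUPYTER_AI_MEMORY_K environment variable, and the build_chat_memory helper are hypothetical illustrations, not part of jupyter-ai:

```python
import os

from langchain.memory import ConversationBufferWindowMemory

# Hypothetical module-level constant: default window of 2 exchanges,
# overridable via an (also hypothetical) environment variable so it can
# be changed without editing the source.
MEMORY_K = int(os.environ.get("JUPYTER_AI_MEMORY_K", "2"))


def build_chat_memory(k: int = MEMORY_K) -> ConversationBufferWindowMemory:
    """Return a windowed chat memory that keeps only the last k exchanges."""
    return ConversationBufferWindowMemory(k=k, return_messages=True)
```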

Additional context

michaelchia (Jul 27 '23)

@michaelchia Yes, agreed. We set that limit earlier in 0.x because a lot of users were running into token limit issues. We're working on two different tasks in parallel:

  1. A more robust configuration system
  2. A better strategy for tracking token usage and limiting prompt size

I will make your issue a priority for when I implement the configuration system.

dlqqq (Jul 27 '23)

Related to #218, a major config refactor.

JasonWeill (Aug 28 '23)

This issue was fixed long ago, but we forgot to close it. Users can now set --AiExtension.default_max_chat_history=... to control the number of historical messages passed to the chat model. 🎉
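For reference, a usage sketch of the setting mentioned above (the value 6 is an arbitrary example); it can go in a standard Jupyter traitlets config file or be passed on the command line:

```python
# jupyter_lab_config.py (or jupyter_server_config.py) -- sketch only;
# the value 6 is an arbitrary example. Equivalent to launching with:
#   jupyter lab --AiExtension.default_max_chat_history=6
c.AiExtension.default_max_chat_history = 6
```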

dlqqq (Feb 04 '25)