
Ability to undo/edit previous request, response

kelvincht opened this issue on Aug 09 '23 · 6 comments

Feature request

Have the ability to:

  1. edit the last request, to get a better quality response
  2. edit a previous request/response to tune response quality
  3. select and delete previous requests/responses to free up unwanted context and improve response quality.

Motivation

I quite like gpt4all because it is easy to set up and "just works" without a complex Python setup or complex LLM configs.

However, one limitation of GPT4All compared to LM Studio and koboldcpp is the inability to edit previous request/response context.

The reason is that, to get a good quality response, I sometimes need to remove or tweak a previous request/response in the middle of the context.

Also, if my last prompt didn't get a good response, I sometimes want to edit that prompt to get a better output.

Your contribution

I can provide feedback

kelvincht · Aug 09 '23

I think a similar request has been made here:

  • #1150

Maybe also earlier ones. I'll update this comment if I find more.

Your first request looks like it's doable because it'll only affect one output and input. It'll have to go back to before the change and process everything again, in any case.

However, if you edit the conversation history in another way, then it wouldn't be "in sync" with the model anymore. So I'm not sure about 2 & 3. (Assuming 2 means going back more than one request/response.)

cosmic-snow · Aug 09 '23

I'm currently looking into this feature, as I consider it necessary functionality for my workflow. I'm considering merkle trees (the same structure git uses), and it won't need larger contexts since it will recalculate anyhow. If some other behaviour is desired, I'm amenable to that as well. Please advise.

Edit: for clarity, I'm specifically talking about adding the functionality myself, but I don't want to implement it in an undesired way, hence my request for advice.

MaxTheMooshroom · Jan 17 '24

I'm considering merkle trees (same structure git uses), and won't need larger contexts since it will recalculate anyhow

That's probably overkill - llama-cpp-python (which ooba's TGWUI uses) just caches the prompt, looks for what changed, and then only decodes the new part - the whole previous conversation is submitted to llama-cpp-python every time it changes. It also implements caching of previous prompts, either in-memory or on-disk, but TGWUI doesn't use it and I haven't personally found it necessary.
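For illustration, here is a minimal sketch of that prefix-reuse idea (not gpt4all's or llama-cpp-python's actual code; the function name and token values are made up): keep the token sequence from the last evaluation, find the longest common prefix with the newly submitted conversation, and decode only what comes after it.

```python
def reusable_prefix(cached_tokens: list[int], new_tokens: list[int]) -> tuple[int, list[int]]:
    """Return (n_past, suffix): reuse the KV cache for the common prefix,
    decode only the suffix (hypothetical helper, for illustration only)."""
    n_past = 0
    for a, b in zip(cached_tokens, new_tokens):
        if a != b:
            break
        n_past += 1
    return n_past, new_tokens[n_past:]

# Example: the user edits their last prompt. The unchanged part of the
# conversation is reused; only the edited tail needs to be decoded again.
cached = [1, 15, 27, 42, 99, 7]        # tokens from the previous evaluation
edited = [1, 15, 27, 42, 88, 63, 5]    # conversation after editing the last prompt
n_past, suffix = reusable_prefix(cached, edited)
assert n_past == 4 and suffix == [88, 63, 5]
```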

I think we should work towards what llama-cpp-python does.

cebtenzzre · Jan 18 '24

Click the edit button, then change the text. When editing a user prompt, the text can be resubmitted for a new reply. When editing an assistant reply, the edited reply is simply saved. Include an option to delete the record, placed opposite from and/or within the edit menu, to prevent accidental deletion.

[Screenshot: EditLastPrompt]

3Simplex · Mar 13 '24

I dare to claim: adding this feature would tremendously improve the quality of the data sent to the datalake!

ThiloteE · Mar 26 '24

I really think this issue should be part of the roadmap, or at least labeled as a medium/high priority feature.

ThiloteE · Jun 13 '24