
Applying diffs failing silently

Open TheoMcCabe opened this issue 1 year ago • 7 comments

Expected Behavior

I would expect gpt-engineer either to successfully apply all diffs sent by the AI, or to fail in a way that tells you which diffs were applied and which failed, so you can manually salvage the failed parts by copying and pasting them.

Current Behavior

The current behaviour seems to be that it applies the sections of the diff that it can and silently throws the rest of the code away. From a user's perspective it looks like everything has gone well, but in reality it has only applied a portion of the diff.

This is really bad from a usability perspective. For one, a partially applied diff is obviously never going to be working code, so applying it is pointless. Also, knowing that this is the behaviour of gpte means I need to manually check every single output to verify it has applied the whole diff, which is a complete waste of time for the diffs that do apply successfully.

Not applying any of the diffs at all would actually be a better outcome for me, as at least I would have a consistent workflow of copying and pasting. However, a more sensible solution is to apply the diffs it can, and if it cannot apply a diff for a file, not apply any change to that file at all, instead providing an error output that is convenient for the user to copy and paste manually into the file.
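The per-file all-or-nothing behaviour suggested here could be sketched roughly as follows. This is a minimal illustration with hypothetical helper names (`try_apply` is assumed, not part of gpt-engineer's actual internals):

```python
def apply_diffs_per_file(diffs_by_file, try_apply):
    """All-or-nothing per file: a file is modified only if every one
    of its hunks applies cleanly; otherwise all of its hunks are
    returned so the user can apply them manually.

    try_apply(path, hunk, commit) is a hypothetical callback that
    returns True if the hunk fits; it only writes when commit=True.
    """
    failed = {}
    for path, hunks in diffs_by_file.items():
        # First pass: check every hunk without writing anything.
        if all(try_apply(path, h, commit=False) for h in hunks):
            for h in hunks:
                try_apply(path, h, commit=True)
        else:
            # Leave the file untouched and surface its hunks.
            failed[path] = hunks
    return failed
```

The key design point is the dry-run pass: no file is partially modified, so the user either gets a fully applied diff or the original file plus an explicit list of what to merge by hand.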

Failure Logs

I can't upload failure logs as the code I'm working on is sensitive.

TheoMcCabe avatar Apr 24 '24 12:04 TheoMcCabe

Hello @TheoMcCabe ,

Thank you for bringing up this issue regarding the diff application process. I apologize for the inconvenience you've experienced in your project.

You are correct in your understanding of our current strategy for applying diffs:

  1. Validation and Correction: We first validate and correct the diffs based on format. If a diff fails validation, we attempt an automated self-heal using our LLMs.
  2. Discard Unrecoverable Diffs: If the self-heal process cannot handle the error, we discard these diffs.
  3. Apply Valid Diffs: All corrected diffs are then applied.
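The three steps above could be sketched roughly as follows. The helper functions (`validate`, `self_heal`) are hypothetical stand-ins, not the actual gpt-engineer implementation:

```python
def process_diffs(raw_diffs, validate, self_heal):
    """Sketch of the validate -> self-heal -> discard pipeline.

    validate(diff)  -> bool: True if the diff is well-formed.
    self_heal(diff) -> str | None: an LLM-corrected diff, or None
                       if the error could not be recovered.
    """
    to_apply, discarded = [], []
    for diff in raw_diffs:
        if validate(diff):
            to_apply.append(diff)
            continue
        healed = self_heal(diff)
        if healed is not None and validate(healed):
            to_apply.append(healed)
        else:
            # Step 2: unrecoverable diffs are dropped (and should
            # be reported to the user).
            discarded.append(diff)
    return to_apply, discarded
```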

Outputs indicating which diffs have been discarded are available in the console and logs. This mechanism is designed to provide a smooth experience for users at all levels and allows for multiple attempts to gradually refine a complex code base.

Additionally, we provide users the option to review and manually decide on applying diffs. You can see the planned changes and make an informed decision at this stage of the process: View Code Here

Would you suggest an interactive approach for applying diffs? For example, showing each validated and corrected diff and allowing users to choose whether to apply them sequentially by user input?

similato87 avatar Apr 25 '24 13:04 similato87

Interactively applying diffs I'm not bothered about. What I find very difficult from a user perspective is that it is very hard for me to know when diffs have failed and haven't been applied. You say that 'Outputs indicating which diffs have been discarded are available in the console and logs.' I disagree: it is very unclear which diffs have not been applied, and this is the problem.

When some of my diffs aren't applied, the output in the console makes it look like everything has worked fine. This is the part that needs improving.

My recommendation is that the last thing printed to the user in the console should be the diffs that were not successfully applied. These need to be output to the console in a format that is really easy to read and to copy and paste. It should use colouring and wording to clearly show that these diffs could not be applied and therefore need to be applied manually by the user.

TheoMcCabe avatar Apr 26 '24 14:04 TheoMcCabe

Do the same diffs work if applied with python-unidiff (https://github.com/matiasb/python-unidiff/)?

Not applying any of the diffs at all would actually be a better outcome for me, as at least i would have a consistent workflow of copy and pasting...

Regarding the use of copy/paste instead of diffs: not having to use diffs would be ideal, but limitations of the AI models make it very difficult to get as output the full original code with just the changes made by the AI.

If you want to understand those limitations, I suggest these two excellent articles by the Sweep devs:

https://github.com/sweepai/sweep/blob/main/docs/pages/blogs/gpt-4-modification.mdx

and the follow-up:

https://github.com/sweepai/sweep/blob/main/docs/pages/blogs/refactor-python.mdx

Emasoft avatar Apr 28 '24 18:04 Emasoft

Hi,

Yes, they should work with python-unidiff, IF the AI produces them correctly enough that they can be corrected into exact unified diffs. The general problem is not applying diffs, but that the AI sometimes delivers low-quality diffs.

ATheorell avatar Apr 29 '24 17:04 ATheorell

https://github.com/sweepai/sweep/blob/main/docs/pages/blogs/refactor-python.mdx

Thanks @Emasoft, I'm aware of these Sweep articles but it's good to read them again. You seem to have misunderstood the issue I'm raising here; apologies if I wasn't clear enough.

This issue relates to the behaviour of the gpte CLI when the AI-generated unified diffs are not valid and cannot be applied. Specifically, I think there is an issue in how this failure is surfaced to users.

I'm not suggesting we rewrite code files from scratch on every run, or that we change our approach to diffing.

TheoMcCabe avatar May 01 '24 14:05 TheoMcCabe

Hi @TheoMcCabe, sorry for the delayed response—I just returned from my trip today. Axel sent me the files, and I've pinpointed the issue. The problem lies with the diff validation; it failed to correct this Docker hunk, and initially, the problematic hunk wasn’t printed in the console from start to finish.

I will create a PR to: 1) address this failure, and 2) ensure the invalid hunk is printed to both the console and the debug file.

Thanks for highlighting this issue, and I apologize for my oversight regarding the hunk output.

similato87 avatar May 02 '24 02:05 similato87

awesome thanks @similato87

TheoMcCabe avatar May 02 '24 15:05 TheoMcCabe