
Undoing searches

vikke1234 opened this issue 5 years ago · 6 comments

I was talking with @12345ieee about adding an option for undoing to scanmem. Something like an "undo_enabled" option, with a limit you could set, e.g. "undo_limit".

For example, say you searched for a 1 in the program and then searched for a 4. If searching for the 4 was a mistake, you'd want to go back to the 1, so you'd do the undo. I don't think supporting more than one level of undo is necessary.

vikke1234 avatar Apr 19 '20 18:04 vikke1234

This could be done, but you'd have to redo the search due to the possibility of the matches being moved around in memory. If I remember correctly, someone has requested something involving saving matches and @sriemer said it's unfeasible ATM.

bkazemi avatar Apr 25 '20 21:04 bkazemi

I was simply thinking about saving the old matches array instead of replacing it with the new one.

Of course stuff may have moved in the meantime, but that's an issue with any consecutive scan, irrespective of undo.

Did I miss something?
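To make the idea concrete, here is a minimal C sketch of "keep the old matches array alive instead of freeing it". All names (`matches_t`, `g_matches`, `g_prev_matches`, `commit_scan`, `undo_scan`) are hypothetical and not scanmem's real internals; this only illustrates the one-level-undo mechanism discussed above.

```c
#include <stdlib.h>

/* Hypothetical stand-in for scanmem's matches array. */
typedef struct {
    size_t count;   /* number of live matches */
    long  *values;  /* stand-in for the real per-match data */
} matches_t;

static matches_t *g_matches      = NULL;  /* current scan results */
static matches_t *g_prev_matches = NULL;  /* one step of history  */

static void free_matches(matches_t *m)
{
    if (m) { free(m->values); free(m); }
}

/* When a new scan produces `fresh`, retire the old array into the
 * undo slot instead of freeing it. */
void commit_scan(matches_t *fresh)
{
    free_matches(g_prev_matches);  /* drop anything older than 1 step */
    g_prev_matches = g_matches;
    g_matches = fresh;
}

/* Undo swaps current and previous, so a second "undo" acts as redo. */
int undo_scan(void)
{
    if (!g_prev_matches)
        return -1;                 /* nothing to undo */
    matches_t *tmp = g_matches;
    g_matches = g_prev_matches;
    g_prev_matches = tmp;
    return 0;
}
```

The cost is exactly one extra copy of the matches array, which matches the OP's "one undo is enough" suggestion.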

12345ieee avatar Apr 25 '20 21:04 12345ieee

https://github.com/scanmem/scanmem/issues/341 same question right? Personally I like the idea, if you have enough memory, it'd be convenient. We could add an option to automatically prune the addresses that don't match the search in the old matches.

bkazemi avatar Apr 26 '20 00:04 bkazemi

Yeah, it's the same thing. The issue I've always had with this is that keeping N copies of the matches array is easily implemented, but what's hard is choosing N.

The OP of this issue suggested I could add a configurable number of arrays to retain, so we can have it at 0 by default (= old behaviour) but let front ends move it to 2 or 10 or 3000, leaving them or the user to decide how much history to keep.
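The configurable-depth variant could be sketched roughly as below: a bounded history of retired match arrays, where a limit of 0 reproduces today's free-immediately behaviour and a full history evicts the oldest entry. The `history_t` type and function names are illustrative, not scanmem code.

```c
#include <stdlib.h>

typedef struct {
    void   **slots;  /* retired match arrays, oldest first */
    size_t   limit;  /* undo_limit; 0 disables history     */
    size_t   len;
} history_t;

history_t *history_new(size_t limit)
{
    history_t *h = calloc(1, sizeof *h);
    h->limit = limit;
    if (limit)
        h->slots = calloc(limit, sizeof *h->slots);
    return h;
}

/* Retire an old matches array.  With limit == 0 it is freed at once
 * (old behaviour); when the history is full, the oldest copy goes. */
void history_push(history_t *h, void *old_matches)
{
    if (h->limit == 0) { free(old_matches); return; }
    if (h->len == h->limit) {
        free(h->slots[0]);               /* evict the oldest entry */
        for (size_t i = 1; i < h->len; i++)
            h->slots[i - 1] = h->slots[i];
        h->len--;
    }
    h->slots[h->len++] = old_matches;
}

/* Pop the most recent retired array, or NULL if none is kept. */
void *history_pop(history_t *h)
{
    return h->len ? h->slots[--h->len] : NULL;
}
```

A front end would simply map its "undo_limit" setting onto `limit` here and call pop on each undo.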

Of course, the longer you wait before using old data, the more of it becomes garbage; that's just how it is...

12345ieee avatar Apr 26 '20 00:04 12345ieee

Sounds good to me. Perhaps we could check memory usage and show a warning if the user starts using enough RAM to slow down their system; with today's RAM amounts, that would only be likely with a high N, however. Maybe we could even generalize this warning; it seems a lot of people have blown their stack.
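One Linux-only way to implement that warning would be to compare the size of the history copy against the kernel's `MemAvailable` estimate from `/proc/meminfo` before keeping it. This is just a sketch; the function names and the 50% threshold are arbitrary placeholders, not anything scanmem does.

```c
#include <stdio.h>

/* Returns MemAvailable in bytes, or 0 if it cannot be read
 * (non-Linux systems, old kernels without the field, etc.). */
static unsigned long long mem_available_bytes(void)
{
    FILE *f = fopen("/proc/meminfo", "r");
    if (!f) return 0;
    char line[128];
    unsigned long long kib = 0;
    while (fgets(line, sizeof line, f)) {
        if (sscanf(line, "MemAvailable: %llu kB", &kib) == 1)
            break;
    }
    fclose(f);
    return kib * 1024ULL;
}

/* Warn (but do not refuse) when keeping another history copy would
 * eat more than half of what the kernel says is still available. */
int history_would_strain_ram(unsigned long long extra_bytes)
{
    unsigned long long avail = mem_available_bytes();
    if (avail && extra_bytes > avail / 2) {
        fprintf(stderr,
                "warn: keeping this scan history needs ~%llu MiB, "
                "only ~%llu MiB available\n",
                extra_bytes >> 20, avail >> 20);
        return 1;
    }
    return 0;
}
```

The same check could back the more general "you're about to blow your memory" warning mentioned above.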

bkazemi avatar Apr 26 '20 18:04 bkazemi

If swap is enabled on their system it should still be okay, but I've noticed it does indeed completely crash the system once there's no RAM or swap left. What could work is mapping the history of matches to a file (under /tmp, for example) via virtual memory. It might unfortunately be slower, but it should be safer than using RAM directly when the matches get too big. Nowadays disks have at least 500 GB or 1 TB, which should be way more than enough (I hope) to save these memory addresses and values (I still use a 120 GB disk though, haha).
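The file-backed idea maps onto standard POSIX calls: `mkstemp` for a unique file under /tmp, `ftruncate` to size it, and `mmap` with `MAP_SHARED` so the kernel can page the history out to the file instead of holding it in RAM or swap. A minimal sketch, with illustrative names and sizes:

```c
#include <fcntl.h>
#include <stdlib.h>
#include <sys/mman.h>
#include <unistd.h>

/* Map `size` bytes backed by a throwaway file under /tmp.
 * Returns the mapping (and the fd via *out_fd), or NULL on error. */
void *map_history_file(size_t size, int *out_fd)
{
    char path[] = "/tmp/scanmem-history-XXXXXX";
    int fd = mkstemp(path);          /* unique backing file */
    if (fd < 0) return NULL;
    unlink(path);                    /* auto-deleted once fd closes */
    if (ftruncate(fd, (off_t)size) != 0) { close(fd); return NULL; }

    void *mem = mmap(NULL, size, PROT_READ | PROT_WRITE,
                     MAP_SHARED, fd, 0);   /* file-backed, not anon */
    if (mem == MAP_FAILED) { close(fd); return NULL; }
    *out_fd = fd;
    return mem;
}
```

Because the mapping is `MAP_SHARED` on a real file, dirty pages can be written back to disk under memory pressure rather than counting against RAM plus swap, which is the safety property being asked for here.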

As for how to store them, in my opinion it would be nice to be able to undo as many times as we want, so we can get back to the first search. Maybe keep an integer that increments for each group of matches left behind by a "next" scan, so we can go back X previous scans, and clear them out when a "new" scan is done; something like that. Of course it will take much more memory, but as you said, we can limit it. Though I don't think limiting is a good option, because sometimes you might want to search for a boolean value that is a byte, and there are a lot of 01 and 00 bytes inside a program, so yeah... But it's up to you, of course.
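The generation-counter scheme described above could look something like this: instead of copying whole arrays, keep every match ever seen and stamp it with the scan number on which it stopped matching, so "undo" just revives the later drop-outs and a "new" scan clears everything. All types and names here are hypothetical illustrations, not scanmem internals.

```c
#include <stddef.h>

typedef struct {
    void    *addr;        /* the matched address            */
    unsigned dropped_at;  /* scan gen it stopped matching;
                             0 while still matching         */
} tagged_match_t;

unsigned g_scan_gen = 0;  /* incremented on every "next" scan */

/* On a "next" scan, mark entries that no longer match instead of
 * deleting them.  `alive[i]` says whether match i still matches. */
void next_scan_mark(tagged_match_t *m, size_t n, const int *alive)
{
    g_scan_gen++;
    for (size_t i = 0; i < n; i++)
        if (m[i].dropped_at == 0 && !alive[i])
            m[i].dropped_at = g_scan_gen;
}

/* Undo back to just after scan `gen`: revive later drop-outs. */
void undo_to(tagged_match_t *m, size_t n, unsigned gen)
{
    for (size_t i = 0; i < n; i++)
        if (m[i].dropped_at > gen)
            m[i].dropped_at = 0;
    g_scan_gen = gen;
}
```

The memory concern raised above applies directly: with this scheme nothing is ever freed until a "new" scan, so a first scan for a one-byte boolean (huge numbers of 00/01 hits) pins all those entries for the whole session.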

XutaxKamay avatar Jun 28 '20 22:06 XutaxKamay