Incremental search has poor user experience with large input
Reproduction Setup:
- Clone https://github.com/torvalds/linux.git and dump the log to a file: `git log > log.txt` (just to make repeated loading into moar faster; `log.txt` will be loaded faster than using moar as the git-log pager). The `log.txt` is about 26m lines and 1.1GB.
- Open it in moar: `moar log.txt`. Let it load all 26m lines (should only take a few seconds).
- Start a search and enter "thenn". Oops, you fat-fingered the n and entered it twice.
- Notice:
  a. The UI appears to freeze.
  b. The pattern display is not updated until a hit is discovered or a failed search through the entire file finishes. (There is a single hit for "thenn" around line 11m.)
  c. Without inspecting a CPU/resource monitor you don't know whether your computer or moar is locked up or actually doing something.
  d. (Assuming you realize moar is searching) attempts to cancel the search via Ctrl+C do nothing.
  e. If you don't realize moar is searching and you attempt to edit the pattern (which is stuck rendering the last found version, not the version being searched), your keystrokes (letters or backspace) produce no immediate effect but instead queue up, potentially sending moar into a loop of search disasters: fail to find the current pattern (or fail to find it quickly), the search ends, the next queued input is added to the pattern, search again, fail, add the next input, and so on. People tend to repeat input when an application doesn't acknowledge it quickly.
Expected:
less avoids some of the same pitfalls because it doesn't do incremental searching (at least not by default). Incremental searching is always going to suffer from the "you have to search the entire input if the pattern isn't found" problem.
But less clearly shows you when it is searching because the ":" prompt is missing, and a search that takes too long can be cancelled via Ctrl+C.
Suggestions:
A. moar should update the rendered search pattern prior to starting a search, so that the user knows what is actually being searched.
B. moar should provide a way of canceling a long-running search (e.g. via Ctrl+C).
C. It would be nice if further editing of the search pattern cancelled the current search and started a new search for the new pattern, instead of waiting for a hit/failed search (see the sketch after this list).
D. It would be nice if the UI provided feedback when moar has been searching for more than, e.g., 250ms.
E. It would be nice if a failed incremental search was represented via UI feedback.
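To make A-C concrete, here is a minimal Go sketch of how they could fit together. This is not moar's actual implementation; the `searcher` type, `onPatternEdited`, and `render` are all hypothetical names. The idea is to render the pattern first, then run the search in a goroutine that a context can cancel, so Ctrl+C or a newer keystroke aborts it instead of queueing behind it.

```go
package pager

import (
	"context"
	"fmt"
	"strings"
)

// searcher is a hypothetical stand-in for moar's search state.
type searcher struct {
	lines  []string
	cancel context.CancelFunc // cancels the in-flight search, if any
}

// onPatternEdited would be called for every keystroke in the search prompt.
func (s *searcher) onPatternEdited(pattern string, render func(string)) {
	render(pattern) // suggestion A: show the pattern before searching for it

	if s.cancel != nil {
		s.cancel() // suggestion C: abort the previous search instead of queueing behind it
	}
	ctx, cancel := context.WithCancel(context.Background())
	s.cancel = cancel

	go func() {
		for i, line := range s.lines {
			if ctx.Err() != nil { // suggestion B: Ctrl+C (or a newer edit) cancels here
				return
			}
			if strings.Contains(line, pattern) {
				fmt.Printf("hit on line %d\n", i+1)
				return
			}
		}
		fmt.Println("not found") // suggestion E: surface the failed search in the UI
	}()
}
```

Checking `ctx.Err()` per line is cheap relative to the string scan, and the same goroutine could drive suggestion D too, e.g. by notifying the UI if it is still running after 250ms.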
I agree about the problem.
I'll look into #296 first though; not reading everything all the time will make this less visible.
I suspect not reading everything at once will make this more visible, as a failed search would then additionally need to wait for the next set of lines to be read in so that it can continue searching. I.e., the best case for a failed search is when everything is already present in moar's working set.
Actually the search is entirely disconnected from the reading.
So what would happen in this case is that the search would scan only the lines in memory, but not trigger reading of any more lines.
The reader is its own class, presenting an interface that looks like an array of lines. The actual reading is done in the background, so calling these methods multiple times may yield different results as more lines may then be available:
https://github.com/walles/moar/blob/65ca4d02eeeafd7aaf62f1a8d87da8f67a440997/m/reader.go#L42-L49
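For illustration, here is a hedged sketch of what such a "looks like an array of lines" reader could look like. The names and structure below are made up for this example, not copied from reader.go: a background goroutine appends lines while callers see a growing slice.

```go
package pager

import (
	"bufio"
	"os"
	"sync"
)

// Reader loads a file in a background goroutine. Callers see a growing array
// of lines: calling Lines twice may return more lines the second time.
type Reader struct {
	mu    sync.Mutex
	lines []string
	done  bool
}

// NewReader starts reading the file in the background and returns immediately.
func NewReader(path string) (*Reader, error) {
	file, err := os.Open(path)
	if err != nil {
		return nil, err
	}
	r := &Reader{}
	go func() {
		defer file.Close()
		scanner := bufio.NewScanner(file)
		for scanner.Scan() {
			r.mu.Lock()
			r.lines = append(r.lines, scanner.Text())
			r.mu.Unlock()
		}
		r.mu.Lock()
		r.done = true
		r.mu.Unlock()
	}()
	return r, nil
}

// Lines returns a copy of the lines read so far and whether reading finished.
func (r *Reader) Lines() ([]string, bool) {
	r.mu.Lock()
	defer r.mu.Unlock()
	return append([]string(nil), r.lines...), r.done
}
```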
Ahh, that explains behavior I'd seen that was confusing. When moar is given input that takes time to load (e.g. because it is large or slowly piped in via stdin), the command to go to the end will jump to whatever arbitrary point in the document moar had most recently loaded, with repeated "end" commands jumping ahead by whatever amount moar had loaded in the interim.
I think failing incremental searches just because the pattern was not found in the set of lines currently in moar's memory is bad user experience. It's also a regression from less, which supports incremental searching (though not on by default) via `--incsearch`. Less's incremental search will "block" until the pattern is found or the entire input stream/file has been searched, while still being cancelable (via Ctrl+C).
A user provides input to a pager via file or stdin, and they generally have no concept of "the set of lines currently loaded by the pager", so a failed search for a pattern that is present in the input file (or stream) will be confusing.
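Building on the hypothetical Reader sketched above (again, not moar's real API), less-style semantics could look roughly like this: keep scanning as more lines arrive, only report "not found" once the reader says it has read everything, and stay cancellable the whole time.

```go
// Assumes the Reader sketch above plus "context", "strings" and "time" imports.
// Not performance-tuned; it only shows the control flow.
func searchBlocking(ctx context.Context, r *Reader, pattern string) (int, bool) {
	scanned := 0
	for {
		lines, done := r.Lines()
		for ; scanned < len(lines); scanned++ {
			if ctx.Err() != nil {
				return 0, false // cancelled, e.g. via Ctrl+C or an edited pattern
			}
			if strings.Contains(lines[scanned], pattern) {
				return scanned + 1, true // 1-based line number of the hit
			}
		}
		if done {
			return 0, false // the entire input was searched; the pattern really is absent
		}
		select {
		case <-ctx.Done():
			return 0, false
		case <-time.After(50 * time.Millisecond): // wait for the reader to catch up
		}
	}
}
```

The key difference from the current behavior is the `done` check: a failed search is only reported once reading has finished, not when the in-memory slice runs out.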
> go to the end will jump to whatever arbitrary point in the document moar had most recently loaded
Yes.
> with repeated "end" commands jumping ahead
No.
At this point, moar should be tailing the input, and additional go-to-end commands should have no effect.
If you have a case where this doesn't work, please open another ticket!
Search is 4x faster starting with this release: https://github.com/walles/moor/releases/tag/v2.9.2
Updated timings for this benchmark (git log linux kernel, search for thenn). Tested on my laptop:
- 8.4s in `moar v1.32.5` (current when this issue was filed)
- 0.8s in `moor v2.9.4` (now)
I still think your suggestions make sense though.