ugrep
ugrep as find (fd) replacement
Hello. Does it make sense, in terms of efficiency and performance, to use ugrep as a replacement for find utilities like GNU find or fd?
Thanks for the feedback. I like your suggestion. Right now, ugrep can be used as a simple `find -path GLOB -print` utility with the `ugrep -g/GLOB ""` option, using a pattern that matches anything, i.e. `""`, and as `find -name GLOB -print` with `ugrep -gGLOB ""`. With ugrep you can also control the recursion depth with `--depth=n` and the short options `-1`, `-2`, `-3`, etc. While `find` does not search file contents to match a regex pattern, it has many more options to specify search constraints on files, as you know, using logical connectives. So it's a different beast, but some things that are handy with `find` could be considered for `ugrep`, I suppose. Do you have a specific use case or requirement that neither `find` alone nor `ugrep` alone meets?
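To make the correspondence concrete, here is a small sketch (the directory layout is made up for illustration; the ugrep commands mirror the options described above):

```shell
# Hypothetical tree to demonstrate the find/ugrep correspondence.
mkdir -p /tmp/ugrep_demo/sub && cd /tmp/ugrep_demo
touch notes.txt sub/readme.txt sub/image.png

# find -name GLOB -print lists files whose name matches the glob:
find . -name '*.txt' -print

# The ugrep counterpart: -g takes the glob, and the empty pattern ""
# matches anything, so only file matching is performed:
#   ugrep -g'*.txt' ""
# Limit recursion depth with --depth=1, or the short form -1:
#   ugrep -1 -g'*.txt' ""
```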
Sorry for the late reply. I haven't run into any incompatibilities between fd and ugrep. The main question was about efficiency. I've read that ugrep uses algorithms that are better than those in grep or rg. I'll try to simplify my initial question: can file searching be faster and more efficient with ugrep than with fd?
> Can file searching be faster and more efficient with ugrep than with fd?
I would think so, since `find` doesn't appear to search in parallel like ugrep does.
`ugrep` and `find` are different beasts, though. Running `ugrep` with a glob (see the previous response) together with `--filter` runs the specified `--filter` command on the files found.
For example, `ugrep '' -g'*.txt' --filter='*:stat %'` displays the `stat` output for all `.txt` files found recursively.
Another example: `ugrep -3 -c '' -g'*.zip' --filter='*:unzip %'` expands all `.zip` files recursively, up to two more subdirectory levels deep in the working directory (use `unzip -t %` to try it out without expanding the files).
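For comparison, the closest plain `find` idiom for the first `--filter` example would be something along these lines (a sketch using find's standard `-exec` mechanism, not a recipe from ugrep's documentation):

```shell
# Roughly corresponds to: ugrep '' -g'*.txt' --filter='*:stat %'
# i.e. run stat on every .txt file found recursively.
find . -name '*.txt' -exec stat {} \;
```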
The `--filter` option is abused to do this (with `*` to match any file extension, so `-g` just selects whatever you want). That's fine. For the second example I used `-c` to output the number of lines produced by `unzip`. Using `-l` to list the files may work, but there can be problems with the way this option terminates the pipe to the process early, without consuming all of the `unzip` output, thereby interrupting the unzip. Perhaps I should think of a way to avoid this and let `-l` (and `-q`) with `--filter` always consume all process output.
I'm sure some clever folks can come up with examples of practical use cases for `--filter` used this way. If someone has ideas to push this further, then I'd very much like to hear more.
@genivia-inc thanks for the reply, it's quite interesting.
You've mentioned find, but how about a comparison with fd? https://github.com/sharkdp/fd
BTW, it would be cool to see a similar benchmark for ugrep|find|fd like the one you've provided for ugrep|grep|rg|etc.
A use case that came to mind immediately upon trying `ug -Q` (interactive mode) is that it is similar to the `fzf`/`fzy` user experience, and it would be nice to be able to search for file names interactively instead of (or narrow by file names in addition to) file contents. Maybe there's a way to do this with `ugrep` and I just haven't figured it out?
You can use `ALT-g` to specify globs interactively in the TUI.
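Outside the TUI, a pipeline can approximate the `fzf`/`fzy`-style file-name search mentioned above (a sketch, assuming `fzf` is installed; the `ugrep` invocation is an assumption based on its `-l` option, not a documented recipe):

```shell
# Produce a recursive list of regular files; either tool can do it.
# With ugrep, an empty pattern matches anything and -l lists the files:
#   ugrep -l '' | fzf
# The same list via POSIX find, shown here in runnable form:
find . -type f | sort
# Piping either list into fzf gives interactive fuzzy narrowing:
#   find . -type f | fzf
```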