rdfind
find duplicate files utility
I was running `rdfind` on a backup drive. I got hundreds of these scary messages: > Dirlist.cc::handlepossiblefile: This should never happen. FIXME! details on the next row: Comments in the...
Thanks for this great tool! I tried it with a set of directories ("vaults") created by dirvish. There is also a huge number of existing hard...
Somewhat related to #29. I often have directory structures with many files of the same type. The first bytes check performs a very limited decimation of the candidates because the...
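To illustrate the limitation above: a minimal C++ sketch (not rdfind's actual code) that buckets candidate files by a small byte sample read at a configurable offset. When many same-type files share a common header, sampling past that header separates candidates much better than sampling the first bytes; the offset and sample size here are arbitrary example values.
```cpp
#include <cstddef>
#include <cstdio>
#include <map>
#include <string>
#include <vector>

// Read `count` bytes starting at `offset`; files that differ in this
// sample cannot be identical, so equal samples define candidate buckets.
static std::string sample_bytes(const std::string& path,
                                long offset, std::size_t count) {
  std::string buf(count, '\0');
  std::FILE* f = std::fopen(path.c_str(), "rb");
  if (!f) return {};
  std::fseek(f, offset, SEEK_SET);
  std::size_t n = std::fread(&buf[0], 1, count, f);
  std::fclose(f);
  buf.resize(n);  // files shorter than offset+count yield a shorter sample
  return buf;
}

int main(int argc, char** argv) {
  const long offset = 4096;      // example value: skip a shared header
  const std::size_t count = 64;  // small, cheap sample
  std::map<std::string, std::vector<std::string>> buckets;
  for (int i = 1; i < argc; ++i)
    buckets[sample_bytes(argv[i], offset, count)].push_back(argv[i]);
  for (const auto& b : buckets)
    std::printf("bucket of %zu file(s)\n", b.second.size());
}
```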
Introduce a -protectsametree option, useful e.g. if you want to delete duplicates from a backup copy that has deviated from your original tree (where you may want to keep duplicates) in order...
Hi, I like this utility because it helps me fix my messed-up photo library. However, since I have a few tens of thousands of files, it takes some time and I miss...
When a directory is mounted over SMB via Samba on Linux, running rdfind inside that directory spams: ``` recursion limit exceeded recursion limit exceeded recursion limit exceeded recursion limit exceeded...
It seems like the -minsize argument is ignored in the command: ``` rdfind -removeidentinode false -minsize 100000000 -ignoreempty true -makeresultsfile true -makehardlinks true /tank/ ``` Tons of 16 byte files...
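One way to check whether the size filter really is being ignored is to scan the generated results.txt for entries below the threshold. A small C++ sketch, under the assumption that results.txt uses whitespace-separated columns with the file size in the fourth field (as its header comment describes); adjust the field positions if your version writes a different layout.
```cpp
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>

int main(int argc, char** argv) {
  const long long threshold = 100000000;  // same value passed to -minsize
  std::ifstream in(argc > 1 ? argv[1] : "results.txt");
  std::string line;
  while (std::getline(in, line)) {
    if (line.empty() || line[0] == '#') continue;  // skip header comments
    std::istringstream row(line);
    std::string duptype, id, depth;
    long long size = 0;
    if (!(row >> duptype >> id >> depth >> size)) continue;
    if (size < threshold)
      std::cout << "below threshold (" << size << " bytes): " << line << '\n';
  }
}
```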
After the run, I see no additional free space, despite the `Totally, 740 Gib can be reduced` message in the console. Here is my full output: ``` # df -h /mnt/s4;...
This follows a suggestion SB sent me by email, thanks! While waiting for rdfind to complete, it would be nice to present some kind of feedback in case the...
I tried rdfind with firmware update files. The problem with these is that the first 1000 bytes are identical and even the last 64 (the current default in `Fileinfo.hh`) don't differ...
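When both the sampled prefix and the sampled suffix collide like this, the cheap checks cannot rule anything out and every candidate falls through to the expensive full-content stage. For just two files, a direct byte-by-byte comparison is the simplest authoritative check; the sketch below is a generic illustration, not rdfind's implementation (which computes whole-file checksums for that stage).
```cpp
#include <cstdio>
#include <cstring>

// Compare two files byte by byte in fixed-size chunks; returns true
// only if every byte matches and both files end at the same length.
static bool same_content(const char* a, const char* b) {
  std::FILE* fa = std::fopen(a, "rb");
  std::FILE* fb = std::fopen(b, "rb");
  bool equal = fa && fb;
  char bufa[4096], bufb[4096];
  while (equal) {
    std::size_t na = std::fread(bufa, 1, sizeof bufa, fa);
    std::size_t nb = std::fread(bufb, 1, sizeof bufb, fb);
    if (na != nb || std::memcmp(bufa, bufb, na) != 0) equal = false;
    if (na == 0) break;  // both hit EOF with everything equal so far
  }
  if (fa) std::fclose(fa);
  if (fb) std::fclose(fb);
  return equal;
}

int main(int argc, char** argv) {
  if (argc != 3) return 2;
  std::printf("%s\n", same_content(argv[1], argv[2]) ? "identical" : "different");
}
```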