Don't try to make more hard links than the fs allows
Many filesystems have 16-bit (or even smaller) link counters; ext4, for example, limits each file to 65000 links. If there are more duplicates of a file than that, rdfind tries to link all of them to a single inode, and every link attempt after the limit has been reached fails.
Once the limit has been reached, the next processed copy should be kept as a regular file and used as the link target for subsequent duplicates, instead of being replaced with a link (and so on, each time the limit is reached again).
The maximum link count for a given file or directory can be queried with pathconf()/fpathconf() (_PC_LINK_MAX), and the current link count with stat()/fstat() (st_nlink); a rough sketch is below.
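A minimal sketch of how the check and the rollover could look, using the POSIX calls mentioned above; `canAddLink`, `makeHardLink`, and `linkDuplicate` are made-up names for illustration, not rdfind's actual code:

```cpp
#include <sys/stat.h>
#include <unistd.h>

#include <cerrno>
#include <string>

// Can `target` accept one more hard link? The limit is queried per path
// because it varies by filesystem; pathconf() returns -1 with errno left
// at 0 when the limit is indeterminate ("no limit").
bool canAddLink(const std::string& target) {
  errno = 0;
  const long maxLinks = pathconf(target.c_str(), _PC_LINK_MAX);
  if (maxLinks < 0) {
    return errno == 0;  // indeterminate limit: treat as unlimited
  }
  struct stat sb;
  if (stat(target.c_str(), &sb) != 0) {
    return false;  // cannot stat the target, play it safe
  }
  return static_cast<long>(sb.st_nlink) < maxLinks;
}

// Minimal replace-with-link step (a real implementation would link to a
// temporary name and rename, so the duplicate is not lost on failure).
bool makeHardLink(const std::string& target, const std::string& duplicate) {
  if (unlink(duplicate.c_str()) != 0) {
    return false;
  }
  return link(target.c_str(), duplicate.c_str()) == 0;
}

// Proposed rollover: once the current target is full, keep this duplicate
// as a regular file and make it the link target for the ones that follow.
void linkDuplicate(std::string& target, const std::string& duplicate) {
  if (!canAddLink(target)) {
    target = duplicate;  // link limit reached: promote this copy
    return;
  }
  makeHardLink(target, duplicate);
}
```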
Who would ever have 65000 identical files :-)
Joking aside, this is certainly a real problem. It does not seem trivial to fix; I'll have to think about this one. Thanks for the tip about https://linux.die.net/man/3/pathconf