finddup - Find files common to two directories (or not)
finddup [OPTION]... [DIR-OR-FILE1 [[and:|not:]DIR-OR-FILE2]]
With one directory as argument, finddup prints the duplicated files found in it. If no directory is provided, the current one is used by default.
With two directories, it prints either the files common to both DIR1 and DIR2 or, with the `not:' prefix, the ones present in DIR1 and not in DIR2. The `and:' prefix is assumed by default and is necessary only if you have a directory name starting with `not:'. Plain files are handled like directories containing a single file.
This command compares files by first comparing their sizes, and therefore runs reasonably fast.
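The size-first idea can be illustrated with a short sketch. This is not finddup's actual code (finddup is written in C); it only shows the principle: two files can be identical only if their sizes match, so bucket by size first and compare contents only within each bucket.

```python
# Illustration of a size-first duplicate search, not finddup's code.
import os
import filecmp
from collections import defaultdict

def duplicate_groups(directory):
    # Bucket every file under `directory` by its size.
    by_size = defaultdict(list)
    for root, _, names in os.walk(directory):
        for name in names:
            path = os.path.join(root, name)
            by_size[os.path.getsize(path)].append(path)

    groups = []
    for paths in by_size.values():
        if len(paths) < 2:
            continue  # a unique size cannot have a duplicate
        remaining = list(paths)
        while remaining:
            ref = remaining.pop(0)
            # Byte-for-byte comparison only within the same-size bucket.
            group = [ref] + [p for p in remaining
                             if filecmp.cmp(ref, p, shallow=False)]
            remaining = [p for p in remaining if p not in group]
            if len(group) > 1:
                groups.append(group)
    return groups
```

Most files are rejected on the size comparison alone, which is why the approach is fast even without content hashing.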
When looking for identical files, finddup associates a group ID with every distinct content and prints it along with the file names. Use the -g option to switch this off.
Note that finddup DIR is virtually the same as finddup -i DIR DIR
None known, probably many. Valgrind does not complain though.
Since files with the same inode are considered different when looking for duplicates in a single directory, hard links can lead to weird behaviors -- which are not bugs.
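For reference, two directory entries that are hard links to the same file share the same (st_dev, st_ino) pair, which a tool could use to treat them as one file rather than as duplicates. A minimal sketch, not part of finddup:

```python
# Detect hard links by comparing device and inode numbers.
import os

def are_hard_linked(path_a, path_b):
    sa, sb = os.stat(path_a), os.stat(path_b)
    return (sa.st_dev, sa.st_ino) == (sb.st_dev, sb.st_ino)
```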
The current algorithm is dumb, as it does not use any hashing of the file content.
Here are the things I tried, which did not help at all:
(1) computing MD5s of the whole files, which is not satisfactory because files are often not read entirely, hence the MD5s cannot be properly computed;
(2) computing XORs of the first 4, 16 and 256 bytes, with rejection as soon as one does not match;
(3) reading files in parts of increasing sizes, so that rejection can happen with only a small fraction read when possible;
(4) using mmap instead of open/read.
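Strategy (3), reading in parts of increasing sizes with early rejection, can be sketched as follows. This is only an illustration of the idea, not the code that was actually benchmarked; the chunk sizes are arbitrary:

```python
# Compare two files in chunks of doubling size, stopping at the first
# mismatch so that different files are rejected after reading only a
# small prefix. Illustrative sketch, not finddup's code.
def same_content(path_a, path_b, first_chunk=4096, max_chunk=1 << 20):
    chunk = first_chunk
    with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
        while True:
            ba, bb = fa.read(chunk), fb.read(chunk)
            if ba != bb:
                return False      # mismatch (or different lengths)
            if not ba:
                return True       # both files exhausted, identical
            chunk = min(chunk * 2, max_chunk)
```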
The format of the output should definitely be improved. Not clear how.
There could be some fancy option to link two instances of the command running on different machines to reduce network disk accesses. This may not help much though.
finddup -p0d blah
List duplicated files in directory ./blah/, show a progress bar, ignore empty files, and ignore files and directories starting with a dot.
List all files which are duplicated in the current directory, do not show the oldest file in each group of identical ones, and do not show group numbers. This is what you could use to list which files to remove.
finddup sources not:/mnt/backup
List all files found in ./sources/ which do not have a content-matching equivalent in /mnt/backup/.
finddup -g tralala cuicui
List groups of files with the same content which exist both in ./tralala/ and ./cuicui/. Do not show group IDs; instead, write empty lines between groups of identical files.
Written by Francois Fleuret <email@example.com> and distributed under the terms of the GNU General Public License version 3 as published by the Free Software Foundation. This is free software: you are free to change and redistribute it. There is NO WARRANTY, to the extent permitted by law.