X-Git-Url: https://fleuret.org/cgi-bin/gitweb/gitweb.cgi?p=finddup.git;a=blobdiff_plain;f=finddup.1;h=2da4ee3dc12e5b24d40182090bd9cfca0d4e2943;hp=e262386a1ec035946a50828f69cf6582b3320429;hb=8c988a4aca00501c9a9d53f4ff228dcb0bce0acb;hpb=ab7b6e26f35ac1dfc88d9bf1e09dd289a30ea782

diff --git a/finddup.1 b/finddup.1
index e262386..2da4ee3 100644
--- a/finddup.1
+++ b/finddup.1
@@ -10,12 +10,13 @@
 finddup \- Find files common to two directories (or not)
 .SH "SYNOPSIS"
 
-\fBfinddup\fP [OPTION]... DIR1 [[and:|not:]DIR2]
+\fBfinddup\fP [OPTION]... [DIR1 [[and:|not:]DIR2]]
 
 .SH "DESCRIPTION"
 
-With a single directory argument, \fBfinddup\fP prints the duplicated
-files found in it.
+With one directory as argument, \fBfinddup\fP prints the duplicated
+files found in it. If no directory is provided, it uses the current
+one as default.
 
 With two directories, it prints either the files common to both DIR1
 and DIR2 or, with the `not:' prefix, the ones present in DIR1 and not
@@ -66,20 +67,24 @@
 files with same inode are considered as different
 
 None known, probably many. Valgrind does not complain though.
 
-The current algorithm is dumb, that is it does not use any hashing of
-the file content. I tried md5 on the whole file, which is not
-satisfactory because files are often never read entirely hence the md5
-can not be properly computed. I also tried XOR of the first 4, 16 and
-256 bytes with rejection as soon as one does not match. Did not help
-either.
+The current algorithm is dumb, as it does not use any hashing of the
+file content.
+
+Here are the things I tried, which did not help at all: (1) Computing
+md5s on the whole files, which is not satisfactory because files are
+often never read entirely hence the md5s can not be properly computed,
+(2) computing XOR of the first 4, 16 and 256 bytes with rejection as
+soon as one does not match, (3) reading parts of the files of
+increasing sizes so that rejection could be done with a small fraction
+when possible, (4) using mmap instead of open/read.
 
 .SH "WISH LIST"
 
 The format of the output should definitely be improved. Not clear how.
 
 Their could be some fancy option to link two instances of the command
-running on different machines to reduce network disk accesses. Again,
-this may not help much, for the reason given above.
+running on different machines to reduce network disk accesses. This
+may not help much though.
 
 .SH "EXAMPLES"