X-Git-Url: https://www.fleuret.org/cgi-bin/gitweb/gitweb.cgi?p=finddup.git;a=blobdiff_plain;f=finddup.1;h=896cc88a4090a8fc9e85f965bd52424d56d60c8e;hp=b279216e3ae94fddc0bfc0d7c2eaee3ec47655af;hb=HEAD;hpb=c255c34c53dfe5dddd7d12f9694c24eaf8e2b2df

diff --git a/finddup.1 b/finddup.1
index b279216..896cc88 100644
--- a/finddup.1
+++ b/finddup.1
@@ -10,7 +10,7 @@ finddup \- Find files common to two directories (or not)
 
 .SH "SYNOPSIS"
 
-\fBfinddup\fP [OPTION]... [DIR1 [[and:|not:]DIR2]]
+\fBfinddup\fP [OPTION]... [DIR-OR-FILE1 [[and:|not:]DIR-OR-FILE2]]
 
 .SH "DESCRIPTION"
 
@@ -21,7 +21,8 @@ one as default.
 With two directories, it prints either the files common to both DIR1
 and DIR2 or, with the `not:' prefix, the ones present in DIR1 and not
 in DIR2. The `and:' prefix is assumed by default and necessary only if
-you have a directory name starting with `not:'.
+you have a directory name starting with `not:'. Files are handled like
+directories containing a single file.
 
 This command compares files by first comparing their sizes, hence goes
 reasonably fast.
@@ -94,7 +95,7 @@ file content.
 
 Here are the things I tried, which did not help at all: (1) Computing
 md5s on the whole files, which is not satisfactory because files are
-often not read entirely, hence the md5s can not be properly computed,
+often not read entirely, hence the md5s cannot be properly computed,
 (2) computing XORs of the first 4, 16 and 256 bytes with rejection as
 soon as one does not match, (3) reading files in parts of increasing
 sizes so that rejection could be done with only a small fraction read
@@ -104,7 +105,7 @@ when possible, (4) using mmap instead of open/read.
 
 The format of the output should definitely be improved. Not clear how.
 
-Their could be some fancy option to link two instances of the command
+There could be some fancy option to link two instances of the command
 running on different machines to reduce network disk accesses. This
 may not help much though.
 
@@ -117,6 +118,14 @@
 List duplicated files in directory ./blah/, show a progress bar,
 ignore empty files, and ignore files and directories starting with a
 dot.
 
+.B finddup -qtg
+
+.fi
+List all files which are duplicated in the current directory, do not
+show the oldest in each group of identical ones, and do not show
+group numbers. This is what you could use to list what files to
+remove.
+
 .P
 .B finddup sources not:/mnt/backup
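The DESCRIPTION hunk above notes that finddup goes reasonably fast because it compares files by size before comparing their content. A minimal sketch of that size-first strategy, in Python rather than the tool's actual C implementation (the function name `find_duplicate_groups` is made up for illustration):

```python
import os
from collections import defaultdict
from filecmp import cmp


def find_duplicate_groups(root):
    """Return groups of identical regular files under root.

    Sketch of the size-first strategy: files of different sizes
    cannot be identical, so most candidate pairs are rejected
    without reading any file content.
    """
    by_size = defaultdict(list)
    for dirpath, _, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            if os.path.isfile(path) and not os.path.islink(path):
                by_size[os.path.getsize(path)].append(path)

    groups = []
    for paths in by_size.values():
        if len(paths) < 2:
            continue  # a unique size cannot have a duplicate
        remaining = list(paths)
        while remaining:
            first = remaining.pop(0)
            # Byte-by-byte comparison, done only within one size class.
            dups = [p for p in remaining if cmp(first, p, shallow=False)]
            if dups:
                groups.append([first] + dups)
                remaining = [p for p in remaining if p not in dups]
    return groups
```

Within each size class this still compares content pairwise; the BUGS section above records that the attempts to speed that part up (whole-file md5s, prefix XORs, incremental reads, mmap) did not help in practice.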