Wednesday, January 26, 2011

Delete duplicate files using md5

Source:

http://www.commandlinefu.com/commands/view/2370/delete-all-those-duplicate-files-but-one-based-on-md5-hash-comparision-in-the-current-directory-tree

2009-06-07 03:14:06
User: masterofdisaster
Functions: find perl rm xargs
find . -type f -print0|xargs -0 md5sum|sort|perl -ne 'chomp;$ph=$h;($h,$f)=split(/\s+/,$_,2);print "$f"."\x00" if ($h eq $ph)'|xargs -0 rm -v --



removed `./duplicate0.mp3'
removed `./1/duplicate1.mp3'
removed `./2/duplicate2.mp3'


DELETE all those duplicate files but one, based on md5 hash comparison, in the current directory tree
This one-liner will *delete*, without any further confirmation, all 100% duplicates but one, based on their md5 hash, in the current directory tree (i.e. including files in its subdirectories).
Good for cleaning up collections of mp3 files, or pictures of your dog/cat/kids/wife present in a gazillion incarnations on your hard drive.
md5sum can be substituted with sha1sum without problems.

The actual filename is not taken into account; just the hash is used.
Whatever sort thinks is the first filename is kept.
It is assumed that the filename does not contain 0x00.
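The one-liner's logic (hash every file, sort, keep the first of each hash group, delete the rest) can be sketched more readably in Python. This is a hypothetical helper for illustration, not the original command; `dedupe` and its `dry_run` flag are my own naming:

```python
import hashlib
import os

def dedupe(root, dry_run=True):
    """Delete all but the first file seen for each content hash.

    Illustrative sketch of the one-liner's logic: which duplicate
    survives depends only on traversal order, never on the filename.
    """
    seen = {}      # hash -> path of the first (kept) copy
    removed = []
    for dirpath, _, names in sorted(os.walk(root)):
        for name in sorted(names):
            path = os.path.join(dirpath, name)
            with open(path, 'rb') as f:
                h = hashlib.md5(f.read()).hexdigest()
            if h in seen:
                if not dry_run:
                    os.remove(path)
                removed.append(path)
            else:
                seen[h] = path
    return removed
```

Unlike the one-liner, this version defaults to a dry run, so you can inspect what would be removed before committing. Swapping `hashlib.md5` for `hashlib.sha1` mirrors the md5sum/sha1sum substitution mentioned above.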

As per the good suggestion in the first comment, this one does a hard link instead:
find . -xdev -type f -print0 | xargs -0 md5sum | sort | perl -ne 'chomp; $ph=$h; ($h,$f)=split(/\s+/,$_,2); if ($h ne $ph) { $k = $f; } else { unlink($f); link($k, $f); }'
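The hard-link variant can be sketched the same way in Python (a hypothetical `hardlink_dupes` helper, not the original command; like the `-xdev` flag above, it assumes all files live on one filesystem, since hard links cannot cross filesystems):

```python
import hashlib
import os

def hardlink_dupes(root):
    """Replace each duplicate with a hard link to the first copy seen.

    Sketch of the hard-link variant: content is stored once, but every
    original path keeps working.
    """
    seen = {}  # hash -> path of the first copy
    for dirpath, _, names in sorted(os.walk(root)):
        for name in sorted(names):
            path = os.path.join(dirpath, name)
            with open(path, 'rb') as f:
                h = hashlib.md5(f.read()).hexdigest()
            if h in seen and not os.path.samefile(path, seen[h]):
                os.unlink(path)           # drop the duplicate...
                os.link(seen[h], path)    # ...and relink it to the kept copy
            else:
                seen.setdefault(h, path)
    return seen
```

The `samefile` check skips files that are already hard links to the kept copy, so the function is safe to run repeatedly.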


  • Alternatively, if you want to keep the appearance but still save space, you can use the "hardlink" command. This searches recursively for duplicates and replaces them all with hard links.

  • fdupes is just simple! If you just want one of the files, type: fdupes -fN
