Feature request?: only dedupe files of same size

Seriously, if I have a 5-terabyte folder where no two files share the same size, dedupe should take essentially zero time: hashes only need to be computed when two or more files have matching sizes.

So can we add an option so that it only looks at files of the same size?

Or is this already happening and I'm just not realizing?
You can post it to the Suggestions forum.

But that's a fairly narrow use case that DEDUPE isn't optimized for. DEDUPE is primarily intended to scan multiple directories for duplicates, so it builds a hash table of all the files and then processes that table. Adding your size check would speed up your single-directory case, but it would add overhead to the normal multi-directory usage.
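For what it's worth, the size-first filter the OP is asking for is straightforward to sketch. The following is a minimal illustration (not DEDUPE's actual implementation): group files by size, and only hash the files whose size collides with another file's.

```python
import hashlib
import os
from collections import defaultdict

def find_duplicates(root):
    """Find duplicate files under root, hashing only size-colliding files."""
    # Pass 1: bucket every file path by its size (cheap stat calls only).
    by_size = defaultdict(list)
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                by_size[os.path.getsize(path)].append(path)
            except OSError:
                continue  # unreadable/vanished file; skip it

    # Pass 2: hash only buckets with 2+ entries; unique sizes are skipped.
    by_hash = defaultdict(list)
    for size, paths in by_size.items():
        if len(paths) < 2:
            continue  # unique size: cannot be a duplicate, no hash needed
        for path in paths:
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            by_hash[(size, digest)].append(path)

    # Return only the groups that are genuine duplicates.
    return [group for group in by_hash.values() if len(group) > 1]
```

In the degenerate case the OP describes (every file a different size), pass 2 hashes nothing at all, so the run cost is just one `stat` per file.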
