DEDUPE Crashes and Takes Out The Current TCC Session With It

If I run:

DEDUPE /S /V /H *.* A:\

DEDUPE crashes and takes out the copy of TCC that it is running in. My guess is that running "/S" on the root of a drive ("A:\") with "*.*" (basically everything on the drive) generates a very large dataset. This dataset is probably too large for DEDUPE to handle and causes a buffer overflow.

Also noticed that there is a "/T" switch in the command-line description shown by "DEDUPE /?". Any idea what that switch does?

Thanks
 
"/T" is even in the help file too (in the "Format:" line). Unfortunately there also with no further explanation ...
 
Code:
DEDUPE /S /V H *.* A:\

What's "H"?

Sorry, that should have been the "/H" switch. On a backup drive I have multiple backups of directories from multiple computers. This allows for multiple duplicate files across multiple directories, but since it is a backup, a duplicate file only needs to be stored once, with hard links for any other duplicates. This could save a lot of space in the case of videos, photos and music.

Actually, I wanted to run something like:

DEDUPE /S /V /H *.MP4 A:\

and

DEDUPE /S /V /H *.JPG A:\

on my backup drive.
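
To make the hard-link idea concrete, here is a rough Python sketch of what I'm hoping DEDUPE /S /H does for me (just my own illustration, not JP Software's implementation; the A:/ root and *.jpg mask are placeholders):

Code:
# Rough sketch of hard-link deduplication: hash every matching file under a
# root, keep the first copy of each unique content, and replace later copies
# with hard links to it. Not DEDUPE itself; ROOT and PATTERN are placeholders.
import hashlib
import os
from pathlib import Path

ROOT = Path("A:/")      # hypothetical backup drive root
PATTERN = "*.jpg"       # hypothetical file mask

def file_hash(path: Path) -> str:
    """Return a SHA-256 digest of the file's contents."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            h.update(chunk)
    return h.hexdigest()

seen = {}  # digest -> first path seen with that content

for path in ROOT.rglob(PATTERN):
    if not path.is_file():
        continue
    digest = file_hash(path)
    original = seen.get(digest)
    if original is None:
        seen[digest] = path                # first copy: keep it
    elif not path.samefile(original):      # later copy: replace with a hard link
        path.unlink()
        os.link(original, path)            # hard links require the same volume

Since hard links only work within a single volume (NTFS on Windows), this only helps for duplicates on the same drive, which is exactly the backup-drive case above.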
 
At one time I had a slow network drive connected to my router; I would keep images there and copy them into projects on my main drive. Eventually, external drives came down in price, increased in size, and became portable, so I copied the network drive's contents to an 8 TB external drive. This drive was also used to back up the projects I created with the images from the network drive.

I ran the following on the network drive:

DEDUPE /S /V /H *.JPG A:\

and the summary was:

Total files: 217,848   Unique: 39,085   Duplicates: 127,791

That should free up a lot of space on my backup drive.
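
One way to check that it actually worked (again, just my own Python sketch with placeholder paths, not part of DEDUPE): count how many of the JPGs now report a hard-link count greater than 1.

Code:
# Sanity check after DEDUPE /H: deduplicated files share one copy on disk,
# so their hard-link count (st_nlink) should be greater than 1.
# Not part of DEDUPE; ROOT and the mask are placeholders.
import os
from pathlib import Path

ROOT = Path("A:/")    # hypothetical backup drive root

linked = sum(1 for p in ROOT.rglob("*.jpg")
             if p.is_file() and os.stat(p).st_nlink > 1)
print(f"{linked} JPG files are hard-linked to another copy")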
 
There is no /T option. I have removed it from the help.

I cannot reproduce the crash. I tried it on a drive with 732,107 files and 178,069 dirs.

Unless you're running a very memory-constrained system (or a disk-constrained one, if you run out of swap space), or an x86 build, you should have ample virtual memory for DEDUPE. DEDUPE'ing a large drive will take a long time, though (and is discouraged in the help).
 
The drive I ran the original command on was:

1,277,901 Files and 389,034 Directories.

Bigger than the drive you ran it on.
 
