What backup tools/strategies do you recommend?

Just want to know: what backup tools, utilities, or strategies do you recommend for the data on your local PC or laptop? I don't want to clone the entire drive (e.g., I would omit "\Program Files\" and "\Program Files (x86)\"), but I do want to keep everything in my \User\{userid} directory, plus two or three directories I've created off the root of C:\.

Does anyone use a special technique for clearly segregating user data (files created by you or for you, including config files for your setup) from program data and infrastructure, which will get re-created with each new install?

Do you always do "full backups" or do you use one full backup followed by incremental backups or other types of backups?

I'm running Windows 8, if it matters. Thanks in advance for any suggestions.
May 26, 2008
CrashPlan! I have been using it for about 5 months now and think it is amazing.

The FREE edition can back up to your other computers or even to friends' computers. Your friends have separate CrashPlan accounts and will not be able to read your backed-up data stored on their machines. You can back up to multiple destinations simultaneously. You can even "seed" a backup at a remote friend's site by first backing everything up to a USB drive on your own computer (for example), taking that USB drive over and copying the encrypted backup onto their local computer, and then configuring your CrashPlan software to start backing up to their machine. It will detect that the data is already there and will only need to synchronize changes over the internet from that point on.

The backups are deduplicated, compressed, and encrypted. Only changed blocks in files are transferred to the backup location each time a backup is run. CrashPlan can retain unlimited versions of files.
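CrashPlan's actual chunking and wire format are proprietary, but the changed-block idea itself is simple: hash each fixed-size block of a file and re-send only the blocks whose hashes differ from the last backup. A minimal sketch (block size and the SHA-256 choice are my assumptions, not CrashPlan's):

```python
import hashlib

BLOCK_SIZE = 4096  # assumed block size for illustration


def block_hashes(data: bytes) -> list[str]:
    """Hash each fixed-size block of a file's contents."""
    return [
        hashlib.sha256(data[i:i + BLOCK_SIZE]).hexdigest()
        for i in range(0, len(data), BLOCK_SIZE)
    ]


def changed_blocks(old: bytes, new: bytes) -> list[int]:
    """Return indices of blocks that differ and would need re-sending."""
    old_h, new_h = block_hashes(old), block_hashes(new)
    return [
        i for i, h in enumerate(new_h)
        if i >= len(old_h) or old_h[i] != h
    ]


# Only the second block changed, so only block 1 would be transferred.
old = b"A" * BLOCK_SIZE + b"B" * BLOCK_SIZE
new = b"A" * BLOCK_SIZE + b"C" * BLOCK_SIZE
print(changed_blocks(old, new))  # [1]
```

This is why repeated backups of a large, mostly-unchanged file are fast: the cost scales with the changed blocks, not the file size.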

The PAID version lets you back up an unlimited amount of data to their cloud storage, and lets you schedule backups to run as often as 1 minute intervals. Because only changed blocks of data are transferred, the backups are usually very fast. (The FREE version only backs up once per day automatically, although you can trigger manual backups more frequently.)

I back up almost everything on my system except for program file areas, the Windows folder, etc. I have the paid version on my main PC and back up almost everything every 15 minutes. In my software development folders I actually back up files every 1 minute. With the unlimited file version retention, I can easily restore files to any earlier point if I screw something up.

I personally back up to multiple locations: cloud, another computer in my house, and then my dad's computer at his house. (My dad cannot read my backed up data as he is using a different CrashPlan account.)

Definitely check it out.

On top of this I use the Windows 7 Backup feature to regularly back up an entire system image of my C: drive to a network location. If my C: drive dies, I can get my system up and running quickly (with all programs and settings) by restoring this image file.
May 26, 2008
For 15 years before CrashPlan I was a big fan of using Backup Exec to back up to tape. (I work in IT and that's how we would back up servers reliably.) I used to perform "full" backups once a week and then "incremental/differential" backups the other days of the week, appending to the same tape used for the full. Tape would be changed every week. I had about 10 tapes so the number of file versions I could restore back to was basically 1 per day for the past 10 weeks.

CrashPlan is insanely better in this regard. I can back up multiple times per day and retain infinite backup versions.

After I used CrashPlan for a while I reduced my tape backups to just once a week and I'll probably stop doing them entirely soon.
Nov 2, 2008
I keep everything that needs to be backed up under l:\save. This is about a CD-ROM's worth, which is dropped to an image regularly and burnt every now and then. The CD burner is on a different box.
I have never fully trusted a pure data-only backup. I'm afraid that something will get lost program-wise that will prevent me from being able to access certain data (e.g., Waves and Native Instruments licenses, etc.). For that reason, I've always used multiple strategies (call it overkill if you wish - I've learned my lesson the hard way).

On my laptop, I do a pair of full backups, using the Clonezilla distro and a program on the SystemRescueCD distro called FSArchiver. Both are capable of backing up a Windows NTFS partition, despite being Linux utilities. On my desktop system, I use FSArchiver and Novastor's NovaBACKUP in Disaster Recovery mode. (I'd use NovaBACKUP on the laptop too, but for some reason it refuses to do a verifiable Disaster Recovery backup on it. Even Novastor was stumped - they gave me a refund on the copy I'd bought for it.)
Nov 2, 2008
The backup strategy I use is built around the fact that data is either fast- or slow-moving, and either recoverable from outside sources or not. Size matters. The unit of backup is the CD-ROM, but the mingw32 build of 'mkisofs' is quite capable of storing duplicates once ('duplicates-once'), so copying files from a directory to a pathed directory has no impact on image size. ISO files can be mounted like diskettes.
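The 'duplicates-once' trick is essentially content-addressed storage: every path is recorded in the index, but identical file contents are stored only once. A minimal sketch of the idea (function and variable names are mine, not mkisofs internals):

```python
import hashlib


def pack(files: dict[str, bytes]) -> tuple[dict[str, str], dict[str, bytes]]:
    """Map each path to a content hash; keep one stored copy per unique
    content, the way an image built with duplicates-once stores one copy
    of identical files no matter how many paths reference them."""
    index, store = {}, {}
    for path, data in files.items():
        digest = hashlib.sha256(data).hexdigest()
        index[path] = digest          # every path is still listed
        store.setdefault(digest, data)  # but identical contents stored once
    return index, store


files = {
    "newin/app/readme.txt": "hello".encode(),
    "backup/app/readme.txt": "hello".encode(),  # pathed duplicate: no extra space
    "notes.txt": "different".encode(),
}
index, store = pack(files)
print(len(index), len(store))  # 3 2
```

Three paths, two stored copies: the duplicate directory costs only an index entry.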

Projects that move very slowly (e.g., the resource-kit and bitmap collections) are backed up infrequently.

The help-file projects (welcome.hlp for w3.1, the OS/2 read.inf, and the dos7 help.hlp) are classed as 'slow moving'.

The vast horde of smallish applications that fills \newin\ and a few other locations is slow-moving as well, but restoring is faster than reinstalling, so these are backed up at random intervals. A pathed duplicate directory in this lot takes no space in the image, by virtue of 'duplicates-once'.

The bulk of the 'fast-moving' stuff, including my internet links (which is stored as a web page set), is kept at cdrom size.

The idea is that a new box can be completely up and running after processing three or four CD-ROMs and running a handful of batch files, after editing just one of them (we use the Windows registry at hklm\software\wendy\paths\ to store most of the info).
May 26, 2008
I have never fully trusted a pure data-only backup. I'm afraid that something will get lost program-wise that will prevent me from being able to access certain data (e.g., Waves and Native Instruments licenses, etc.). For that reason, I've always used multiple strategies (call it overkill if you wish - I've learned my lesson the hard way).
I use multiple as well. I didn't mention it in my earlier post, but I also utilize Windows 7's native Backup app to do a system state image which is saved to another computer on the network. This allows for very quick recovery of the entire system in case my C: drive blows up.

Crash Plan gets me offsite, continuous protection of my data files. Can't recommend it enough.
I've found that the first step in backing up a system is to consider backup when designing and building it.

My PC has a single 2TB drive. I partitioned my C: drive to 32GB; my data partition (D:) is about 200GB, and the other partitions are split by function - my M: drive is music, for example. I also have an external 2TB eSATA disk (B:), a networked PC with a 2TB drive (P:), and a 2TB USB drive I keep in a fireproof safe.

The reason for having a 32GB C: drive was specifically for backups; you can image a drive like that in about ten minutes and write it to any number of 32GB media options - hell, I picked up a 32GB USB 3.0 thumb drive the other day for twelve dollars. This also really helped when I picked up an SSD for my C: drive; I just imaged my existing partition onto it and I was set. Friends with 500GB C: drives went through hell trying to figure out how to generate a new C: drive that was current with all their config data, installed programs, and so on.

Every morning at 4am, a scheduled job kicks off a TCC script, which does various things on various days. On Tuesdays, it generates an image file of the C: drive and saves it in a directory on the D: drive. On Wednesday through Monday, the D: drive is incrementally backed up to the network P: drive (which also picks up the image of the C: drive), along with any changes on the other data drives. I've found the best command-line file-copy utility is Robocopy (a free download up through XP; built into Win7 and on), and the best Windows-based backup is SyncBack. Should my primary PC die, the secondary PC will have all of its files current to 4am the morning it crashed. It will also have a bootable Macrium image of the C: drive, which in the worst case would be 8 days old.

I used to automatically close the command window on completion, but I could potentially miss a fault, so I don't do that any more. Every morning, when I turn my monitor on, I see the window showing the status of the successful backup. If I don't, or if it shows a failure, then I can take corrective action immediately.

Of course, there's always the possibility that a fire or somesuch could wipe out both my primary and secondary PCs, which is why I have the fireproof safe. The first weekend of the month, I take the 2TB disk out of the fireproof safe, and run another TCC script, which clears the destination drive, and then images my C:, D:, and other drives to the USB disk. That takes a few unattended hours, and then the drive is tossed back into the safe for another month.

Now, should all of *that* fail, I also have two 2TB encrypted 2.5" drives, on which I make an image of my system monthly. I rotate them, keeping one offsite and one at home. Should my place burn to the ground, and even the fireproof safe not survive, I will have an offsite backup that is at worst 29 days old. Yes, I realize that at any given time my PC data has no fewer than five separate backup copies, including one stored offsite. But with the exception of the monthly backup, which takes all of about five minutes to hook up a USB drive and start a shell script in a command window, it's completely automated.