Makin’ Copies

Backups are the New Year’s Resolutions of computing. You start out with good intentions, and by the end of January you’re eating Twinkies and you haven’t backed up in weeks. Prior to my iMac’s premature death, my backup strategy was the electronic equivalent of keeping your savings under the mattress: I burned my iPhoto Library to DVDs, copied a few things to another machine when I thought of it, and mostly forgot about it. I kept meaning to do something about it, and then I kept not doing anything about it. I was also dragging my feet because I was waiting for the fancy new Time Machine feature in Leopard, originally due out in June. Nothing could possibly go wrong before Leopard comes out, right?

Of course, nothing leads to good backups like system failure and data loss. This time I was lucky: my data loss was only loss of access to my data; once the iMac was fixed, all my data was right where I left it. Had my dead logic board been a dead hard drive, I would have been up the proverbial creek, which empties into the Bay of Despair, where lies the Isle of Self-Recrimination. And so it was that after unpacking the Mac and confirming everything worked, and after a brief aside (read: reboot) for the latest Mac OS X security update, I started working on a backup plan.

My long-term (say, six months tops) plan is to have automatic backups to some kind of RAID setup. Storage just keeps getting cheaper, and RAID no longer requires rocket surgery to use. For USD 400.00 at my local Apple Store, I can get a terabyte in a box from Western Digital, usable as either RAID 0 (no protection, seen as one big drive) or RAID 1 (half the storage, but double the copies) via USB 2.0 and FireWire 400/800. For USD 500.00 I can buy a Drobo (be sure to watch the video), a USB 2.0 (sure wish it had FireWire 800) self-managed RAID box that can sport one to four drives in any mix of sizes and intelligently manage the RAID automatically. Of course, I’d still have to shell out for the drive(s), but the beauty there is that you buy only as much storage as you need (any brand of SATA drive), and you benefit as storage prices continue to drop. Also, it’s a one-time investment; if you run out of room and free drive bays, you replace smaller drives with bigger drives, one at a time, as needed. Either of these solutions could be hung directly off my Mac for use with Time Machine, or attached to my Linux server or a NAS for a networked setup.

In the interim, I’ll settle for a nightly copy of the data I care about. In my case, that’s the /Users directory and everything in it. I have an old beige box in the basement running Linux as a headless server; among other uses, it hosts 160GB of storage via Samba. Keeping my ~85GB of /Users data copied onto the 160GB drive should be good enough for now: if either machine goes, I’ll have one good copy.
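Before the first full copy, it’s worth a quick sanity check that the data actually fits on the target drive. A minimal sketch (the paths are the ones from my setup above):

```shell
# On the Mac: how big is the data to back up?
du -sh /Users

# On the server: how much free space is left on the target drive?
df -h /var/local/sync
```

If the du number is creeping up on the df number, it’s time to prune or buy a bigger drive.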

To maintain my copy, I am using rsync (see also the Wikipedia entry). I recalled that there are some issues with rsync on OS X, including the fact that the HFS file system used by OS X supports resource forks, which most other file systems do not. A bit of the Google turned up an rsync on OS X Mini-HOWTO that nicely sums up the issues and offers a patch for Apple’s version of rsync. I compiled a new rsync on both my Mac and the Linux server with no issues, and proceeded to make a full backup of /Users to the remote system. For the curious (and for my own future reference), the particular rsync incantation I used was:

sudo /usr/local/bin/rsync -aREx --delete /Users/ user@server:/var/local/sync/iMac/

Where of course user and server are placeholders, and /var/local/sync/iMac/ is a directory I carved out to hold the backup. I already had SSH set up between the Mac and the server; any decent rsync guide can elaborate on that if you don’t know the dance. The next step is to automatically run this command nightly, which will require a little re-jiggering; my default SSH key is protected by a passphrase. I found one way of automating rsync, but I plan to do a little more research. For the next day or three, I’ll run it manually before bedtime.
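For my own notes, the usual way around the passphrase problem (a sketch I haven’t tested yet; the key path and schedule are my own placeholders) is a second, dedicated key used only for this backup job, pointed at by cron:

```shell
# Generate a dedicated, passphrase-less key used only for the backup job,
# so cron can run rsync unattended without unlocking my main key.
mkdir -p "$HOME/.ssh"
ssh-keygen -q -t rsa -N "" -f "$HOME/.ssh/backup_key"

# Copy backup_key.pub into ~/.ssh/authorized_keys on the server. For safety,
# that line can be prefixed with a command="..." restriction so the key is
# only allowed to run the rsync server process, nothing else.

# Then a crontab entry (crontab -e on the Mac) runs the sync nightly at
# 2:30 AM, telling ssh to use the dedicated key:
# 30 2 * * * /usr/local/bin/rsync -aREx --delete -e "ssh -i $HOME/.ssh/backup_key" /Users/ user@server:/var/local/sync/iMac/
```

The command="..." restriction is the part that makes me comfortable leaving a passphrase-less key lying around; without it, anyone who grabs the key gets a full shell on the server.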

Both comments and pings are currently closed.

4 Responses to “Makin’ Copies”

  1. - Drobo Says:

    […] In my previous post, I mentioned the Drobo as a possible option for my plan to eventually move to RAIDed storage for my backups. I forgot to mention that I expected to see a review from Sean based on this teaser post a few days ago. It looks like he pulled the trigger, complete with 2 terabytes of drive space. Money quote: […]

  2. Maria Says:

    If you have a .Mac account, you can use Backup to back up up to 10 GB of your most precious files to your iDisk.

    Or, if you want to retain the file/folder structure, just set up your iDisk to automatically synchronize and store all your important files there.

    OR you can use Fetch or some other FTP client to automatically mirror folders containing important files to the FTP server for your Web site. Depends on how much disk space they give you, though.

    I do ALL of these things. Why? I’ve lost 3 hard disks in 15 years. Live and learn.

  3. Daniel Aleksandersen Says:

    Thanks for sharing the link to the how-to on Mac and rsync. It even proved helpful on my Kubuntu system!

  4. Jason Says:

    Maria – I don’t use .Mac, and I’ve always been frustrated by the fact that Apple’s Backup only works with .Mac… Backups are far too important to be an add-on feature; that’s one reason I’ve been looking forward to Time Machine in Leopard.

    I definitely like the idea of off-site backup, although size is an issue. My current backup is 85G, far too large for .Mac, although my hosting plan at Dreamhost (disclosure: referral link, I get a kickback if you sign up) currently has over 520G available. If I go to offsite backups, I’ll definitely stick with rsync, now that I have it working. The time for the initial upload would be a killer though.

    I could probably trim that 85G down, but not as much as you might think. About 33G of that is iTunes music, which doesn’t absolutely require backups, but almost 20G is my iPhoto library, which I want backed up. I do export all the photos to DVD from time to time, but I only update those once or twice a year.

    I probably should do an inventory of the folders that are important enough to need offsite backup in addition to my local backups, as you say. It will take a bit more work to determine exactly what to back up, but it’s probably worth it in the long run, as you point out.

    I wonder if rsync supports (or could support) an encrypted volume on the other end? Not sure how I feel about syncing all of my docs to a server not under my physical control. Hmm, research for another day.