In my previous post, I mentioned the Drobo as a possible option for my plan to eventually move to RAIDed storage for my backups. I forgot to mention that I expected to see a review from Sean based on this teaser post a few days ago. It looks like he pulled the trigger, complete with 2 terabytes of drive space. Money quote:

It took more time to open the packaging than actually doing the install.

I’m looking forward to some further impressions, especially with regard to transfer speed. As a professional photographer, he shuffles some serious tonnage of data around.

Makin’ Copies

Backups are the New Year's Resolutions of computing. You start out with good intentions, and by the end of January you're eating Twinkies and you haven't backed up in weeks. Prior to my iMac's premature death, my backup strategy was the electronic equivalent of keeping your savings under the mattress: I burned my iPhoto Library to DVDs, copied a few things to another machine when I thought of it, and mostly forgot about it. I kept meaning to do something about it, and then I kept not doing anything about it. I was also dragging my feet because I was waiting for the fancy new Time Machine feature in Leopard, originally due out in June. Nothing could possibly go wrong before Leopard comes out, right?

Of course, nothing leads to good backups like system failure and data loss. This time I was lucky: my data loss was only loss of access to my data; once the iMac was fixed, all my data was right where I left it. Had my dead logic board been a dead hard drive, I would have been up the proverbial creek, which empties into the Bay of Despair, where lies the Isle of Self-Recrimination. And so it was that, after unpacking the Mac and confirming everything worked, and after a brief aside (read: reboot) for the latest Mac OS X security update, I started working on a backup plan.

My long-term (say, six months tops) plan is to have automatic backups to some kind of RAID setup. Storage just keeps getting cheaper, and RAID no longer requires rocket surgery to use. For USD 400.00 at my local Apple Store, I can get a terabyte in a box from Western Digital, usable as either RAID 0 (no protection, seen as one big drive) or RAID 1 (half the storage, but two copies of everything) via USB 2.0 and FireWire 400/800. For USD 500.00 I can buy a Drobo (be sure to watch the video), a USB 2.0 (sure wish it had FireWire 800) self-managed RAID box that can sport 1 to 4 drives in any mix of sizes and manage the RAID intelligently and automatically. Of course, I'd still have to shell out for the drive(s), but the beauty there is that you buy only as much storage as you need (any brand of SATA drive), letting you take advantage of storage prices as they continue to drop. Also, it's a one-time investment; if you run out of room and free drive bays, you replace smaller drives with bigger drives, one at a time, as needed. Either of these solutions could be hung directly off my Mac for use with Time Machine, or attached to my Linux server or a NAS for a networked setup.

In the interim, I'll settle for a nightly copy of the data I care about. In my case, that's the /Users directory and everything in it. I have an old beige box in the basement running Linux as a headless server; among other duties, it hosts 160GB of storage via Samba. Keeping my ~85GB of /Users data copied onto the 160GB drive should be good enough for now: if either machine goes, I'll have one good copy.

To maintain my copy, I am using rsync (see also the Wikipedia entry). I recalled that there are some issues with rsync on OS X, including the fact that the HFS file system used by OS X supports resource forks, which most other file systems do not. A bit of the Google turned up this rsync on OS X Mini-HOWTO at lartmaker.nl that nicely sums up the issues and offers a patch for Apple's version of rsync. I compiled a new rsync on both my Mac and the Linux server with no issues, and proceeded to make a full backup of /Users to the remote system. For the curious (and for my own future reference), the particular rsync incantation I used was:

sudo /usr/local/bin/rsync -aREx --delete /Users/ user@server:/var/local/sync/iMac/

Where of course user and server are placeholders, and /var/local/sync/iMac/ is a directory I carved out to hold the backup. I already had SSH set up between the Mac and the server; any decent rsync guide can elaborate on that if you don't know the dance. The next step is to automatically run this command nightly, which will require a little re-jiggering; my default SSH key is protected by a passphrase. I found one way of automating rsync, but I plan to do a little more research. For the next day or three, I'll run it manually before bedtime.
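One low-tech way to automate the nightly run is cron plus a second SSH key with an empty passphrase, authorized on the server just for this job. This is only a sketch of the idea, not a tested setup; the key filename, schedule, and the user/server names are all placeholders:

```shell
# On the Mac, generate a dedicated key with an empty passphrase (run once;
# the filename and comment are arbitrary):
#
#   mkdir -p ~/.ssh
#   ssh-keygen -t rsa -f ~/.ssh/backup_key -N '' -C 'nightly rsync'
#
# Append ~/.ssh/backup_key.pub to ~/.ssh/authorized_keys on the server, then
# add this line to the root crontab (sudo crontab -e) to sync nightly at 2:30 AM:
30 2 * * * /usr/local/bin/rsync -aREx --delete -e 'ssh -i /Users/jclark/.ssh/backup_key' /Users/ user@server:/var/local/sync/iMac/
```

The -e flag tells rsync which remote shell command to use, so the cron job can point at the passphrase-free key without touching my default one.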

HOWTO Backup an Entire Windows Drive with OS X and Ubuntu

Note: This method backs up the entire drive, free space and all. If you have a 30G hard drive, you'll need 30G free on the target Mac. If you only want to recover some of the files, check out my article HOWTO Recover Files from a Non-Bootable Windows PC using Ubuntu Live. That article also doesn't require a Mac to retrieve the files; you could use another Windows box, for example.

About a year ago, I posted a method for backing up a Windows laptop with OS X, which used a Knoppix Live CD and NFS. Today, I needed to perform the same task. I wanted to use Windows file sharing instead of NFS, since support is built into OS X and can be enabled from System Preferences. I also wanted to use Ubuntu instead of Knoppix, since I had an Ubuntu 6.06 CD handy, and I'm an Ubuntu fan. While I had some issues, I came up with a method which I think is easier than the old one.

The laptop I needed to back up recently had a memory chip go south, so it only has 192M of memory, and I believe part of that is used for video. Unfortunately, this is below the recommended 256M minimum RAM for an Ubuntu desktop install. I haven't found a minimum requirement for running the Live CD, but considering that no swap space is available when running a Live CD, I expect it is at least the same. I found that running the Live CD on this machine was so slow as to be unusable.

Hoping to find a way to reduce memory requirements, I searched for a comprehensive guide to the Live CD’s boot options, with no success. I also searched for a way to boot the Live CD without X Windows (text mode only), also with no success. If anyone can help with either option, please leave a comment.

Stuck, I decided I'd have to download a different Live CD. Although other options exist, I decided this was a good opportunity to try Xubuntu, an Ubuntu variant that uses Xfce instead of GNOME for its desktop environment. Xfce is designed for machines with low resources, and the Live CD requires only 128M.

Xubuntu worked fine. I couldn't find some of the GUI tools Ubuntu provides that I've used in past HOWTOs, so this one is mostly command line. As a bonus, if you have a machine with a little extra RAM and already have an Ubuntu Live CD handy, these instructions should work equally well.

  1. On the Mac that will receive the backup, make sure that Windows Sharing is enabled via the Sharing pane in System Preferences. By default, this will share your home directory; that's where we'll put the backup. In the instructions below, my user name is jclark; substitute your own. Also, make a note of your Mac's IP address. If you don't know it, open Terminal and run ifconfig.

  2. Boot the system to be backed up with a (X)Ubuntu Live CD (also called the Desktop CD in the latest release).

  3. Run the Terminal application. Depending on your *Buntu of choice, it will be in one of the menus.

  4. Install support for mounting Windows shares (will be installed in RAM only):

    sudo apt-get install smbfs
  5. Create a mount point (a local directory that will host your Mac home directory):

    cd /mnt
    sudo mkdir mac
  6. Mount your (Mac) home directory on the source machine. Change “mac-ip” to your Mac’s IP address, and change “jclark” to your Mac username (in both places):

    sudo mount -t cifs -o 'username=jclark' //mac-ip/jclark /mnt/mac

    You will be prompted for a password, provide your Mac password. Note: using -t cifs instead of -t smbfs (as you may expect) avoids a 2GB file size limitation.

  7. Copy the hard drive to your Mac. This assumes your Windows hard drive is ‘hda1’, which it probably is. If you know it isn’t (and you know the correct value), change accordingly.

    sudo dd if=/dev/hda1 of=/mnt/mac/drive_backup.img
  8. Wait. This could take a while. Backing up my 30G drive took 6.5 hours. I expected it to be faster; maybe the laptop only supports 10 Mbit Ethernet. If you’ve got 100 Mbit Ethernet, this should be faster. Oh, and if you are on a machine with a wireless connection (assuming it even works under Ubuntu), I recommend using a direct, wired Ethernet connection for this if at all possible.

  9. When it finishes, unmount the shared drive:

    sudo umount /mnt/mac

    and shut down Ubuntu.

  10. On the Mac, in your home directory, you now have a disk image named drive_backup.img. Double-click it to open it like any other drive image. You can copy files out as needed.
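For reference, steps 4 through 9 above can be rolled into a single shell function to run from the Live CD's Terminal. Treat this as an untested sketch: the function name is mine, the IP and username arguments are placeholders, and (as in step 7) confirm /dev/hda1 really is your Windows partition before letting dd loose:

```shell
#!/bin/sh
# Steps 4-9 in one go, meant to be run from the (X)Ubuntu Live CD.
backup_windows_drive() {
    mac_ip="$1"              # your Mac's IP address
    mac_user="$2"            # your Mac short username, e.g. jclark
    src="${3:-/dev/hda1}"    # Windows partition; change if yours differs

    sudo apt-get install -y smbfs &&           # support for mounting the share
    sudo mkdir -p /mnt/mac &&                  # mount point for the Mac share
    sudo mount -t cifs -o "username=$mac_user" "//$mac_ip/$mac_user" /mnt/mac &&
    sudo dd if="$src" of=/mnt/mac/drive_backup.img &&
    sudo umount /mnt/mac
}

# Example, with placeholder values:
#   backup_windows_drive 192.168.1.10 jclark
```

The && chaining stops at the first failure, so a bad mount won't leave dd writing an image into thin air.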