Howto Backup your Mac incrementally over SSH

March 10, 2006

Do you have access to a shell account on a Unix server with some spare space? If so, it's pretty easy to incrementally back up your files securely with SSH.

I titled this entry Howto Backup your Mac incrementally over SSH, but this technique can also be used to back up any computer that can run rsync and ssh. They are already installed on Mac OS X and most Linux / Unix servers.

Step 1 - Create a folder to store your backups on the remote server

mkdir backup

Make sure that your SSH user has permission to write to this directory.
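If you want to check the directory in the same step, something like the following works once you're logged in to the server over SSH (run it on the server itself, or prefix it with ssh):

```shell
# Create the directory and confirm your user can write to it:
mkdir -p "$HOME/backup"
test -w "$HOME/backup" && echo "backup directory is writable"
```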

Step 2 - Set up automatic authentication (Optional)

This step allows the backups to run without prompting you for a password. You can omit this step, but you will have to type in your SSH password each time you run a backup.

I wrote an article called Setting up public key authentication over SSH that will guide you through this step.
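The gist of that article, sketched with an empty passphrase so cron can use the key unattended (the file name backup_key and pete@yourserver are placeholder examples, not anything from the article):

```shell
# Generate a key pair; -N "" means no passphrase (backup_key is a
# hypothetical file name -- pick your own):
mkdir -p "$HOME/.ssh"
ssh-keygen -t rsa -N "" -f "$HOME/.ssh/backup_key"
# Append the contents of backup_key.pub to ~/.ssh/authorized_keys on the
# server, then connect with: ssh -i ~/.ssh/backup_key pete@yourserver
cat "$HOME/.ssh/backup_key.pub"
```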

If you own the server you might also want to create a user specifically for this process.

Step 3 - Use rsync to backup files incrementally

rsync -e "ssh" -rca --delete-after ~/test/ pete@yourserver:backup

Now let's break it down a bit:

  • rsync - this syncs the local directory with a directory on the server.
  • -e "ssh" - this tells rsync to use ssh. If you want to pass in other ssh options, such as the port, you can do that in the quotes: -e "ssh -p 12345"
  • -rca - recursive, checksum, and archive.
  • --delete-after - this will delete files on the server if you delete them locally.
  • ~/test/ - I am backing up / syncing the test directory inside my home directory on my Mac.
  • pete@yourserver:backup - my ssh username is pete, yourserver stands in for my remote server's hostname, and I am backing up into the directory ~pete/backup.

Excluding directories

Sometimes you might want to exclude a directory from being backed up, perhaps your Music directory, since that is already backed up on your iPod.

rsync -e "ssh" -rca --delete-after --exclude=Music --delete-excluded ~/test/ pete@yourserver:backup
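The --delete-excluded switch additionally removes previously backed-up copies of the excluded files from the server. Here's a quick local sketch of the exclude behaviour (throwaway /tmp paths again):

```shell
mkdir -p /tmp/ex-src/Music /tmp/ex-dest
echo song  > /tmp/ex-src/Music/track.mp3
echo notes > /tmp/ex-src/notes.txt
# Music is skipped; everything else is copied:
rsync -rca --exclude=Music --delete-excluded /tmp/ex-src/ /tmp/ex-dest/
ls /tmp/ex-dest
```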

Step 4 - Schedule it with cron (Optional)

Now let's create a cron job (scheduled task) to run this command every day. First make a new file in your home directory, called for example backup.sh, containing:

rsync -e "ssh" -rca --delete-after ~/test/ pete@yourserver:backup

Now make sure the file is executable: chmod ug+x backup.sh
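Putting that together, you can create the script and mark it executable straight from the terminal (backup.sh and pete@yourserver are placeholders; substitute your own names):

```shell
# Write the one-line script; the rsync command is only written to the
# file here, not executed:
cat > "$HOME/backup.sh" <<'EOF'
#!/bin/sh
rsync -e "ssh" -rca --delete-after ~/test/ pete@yourserver:backup
EOF
# Make it executable:
chmod ug+x "$HOME/backup.sh"
test -x "$HOME/backup.sh" && echo "script is executable"
```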

Next type crontab -e. This will open up your scheduled task list; in most cases it will be an empty file. Add the following line to it:

15 11 * * * ~/backup.sh

That will schedule the script to run at 11:15am every day. (Note that a personal crontab edited with crontab -e does not take a username field; that field only appears in the system-wide /etc/crontab.) If you don't want to do this step, you can simply run the command whenever you want to make a backup.
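For reference, a few other schedules in the same five-field crontab format (minute, hour, day of month, month, day of week); ~/backup.sh is a hypothetical script name:

```
# every day at 11:15am
15 11 * * * ~/backup.sh
# every Saturday at 2:00am
0 2 * * 6 ~/backup.sh
# every 6 hours
0 */6 * * * ~/backup.sh
```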

DISCLAIMER: You should test everything before performing the backups on your live data. If you would like to use the commands above, you must acknowledge that I'm not responsible for any loss of data.

I'd love to hear your feedback on this, please comment below. I use this technique to back up my two web servers onto each other.

Pete, you may be interested in the FreeNAS project. This OS can turn an old computer into a file server with everything you have described here.
Thanks Pete, I have it working like a champ. A dedicated server of mine is backing itself up to a shared server account I have. They both have fast connections so it's going really quickly.
I tried this approach, but...the resource forks are not preserved. Of course, if your data doesn't require this, it's a great solution.
Be careful: this will only back up the main data stream of each file. Applications dropped into the backed-up folders, for example, wouldn't get backed up properly (they would show up as folders, losing all ability to run). This is good for backing up data files that are single-streamed, but not all files are. Make sure the copies of the files you have are readable on the backup media.
I didn't quite understand the last comment. If someone can put it in slightly more layman's terms I would appreciate it.
Beware "--delete-after". I run two scheduled cron jobs, one S,M,Tu,W,Th,F without the --delete-after option, then a separate Saturday job with the --delete-after option. If you run this automatically always using the --delete-after switch, you will find that the file you accidentally deleted yesterday was faithfully removed from your backup copy as well.
Here's a script I wrote for backups years ago. Change the following variables:

  • RSYNCUSER - unix username on remote host
  • EMAIL - your email address
  • BACKUPSERVER - your remote hostname
  • KEY - path to your ssh key (passwordless login)

#!/bin/bash
LABEL="BACKUP-`date +%Y%m%d`"
SUMMARY="/var/tmp/summary"
TIME=`date +%k:%M:%S`
BACKUPDATE=`date "+%A %d/%m/%Y"`
EMAIL=your@email.address
RSYNCUSER=your_rsync_username
BACKUPSERVER=backup.server
KEY=/root/keys/backup.server.key

check_server() {
    ssh -i $KEY $RSYNCUSER@$BACKUPSERVER exit >> /dev/null 2>&1
    if [ $? -ne 0 ]; then
        printf "Backup did not run. The backup server was unavailable.\n" >> $SUMMARY
        cat $SUMMARY | mail -s "Backup summary for $BACKUPDATE" $EMAIL
        exit 1
    fi
}

run_backup() {
    printf "Filesystem backup performed on `date +%m/%d/%Y`.\n\n" >> $SUMMARY
    printf "\nSession summary:\n" >> $SUMMARY
    printf "\n\t\t`date +%k:%M:%S`\t- Session started\n" >> $SUMMARY
    printf "\n\t\t`date +%k:%M:%S`\t- Processing backup\n" >> $SUMMARY
    rsync -ao --stats -e "ssh -i $KEY" / $RSYNCUSER@$BACKUPSERVER:/ --exclude='/proc*' --exclude='/dev*' > /var/log/backup.log 2>&1
    if [ $? -eq 0 ]; then
        printf "\n\t\t`date +%k:%M:%S`\t- Session completed\n\n" >> $SUMMARY
    else
        printf "\n\t\t`date +%k:%M:%S`\t- Session aborted\n\n" >> $SUMMARY
    fi
}

check_server
run_backup
cat $SUMMARY | mail -s "Backup summary for $BACKUPDATE" $EMAIL
exit 0
This will work fine for resource forks and other metadata and extended attributes if you have at least 10.4 and add the "-E" switch to the rsync command.
A concise and very useful post. I could backup a damaged-Windows disk using a Debian live CD via rsync. Thanks.
Thanks for the howto. I'm now backing up my MacBook and iMac to my Ubuntu file server. You saved me $500 on a Time Capsule.
If you would like to keep your backup data (both in transmission and in storage) encrypted, then you may be interested in LBackup.
I tried doing it this way but I have SO MANY music/video/photo files that I modified that it took forever. I had to implement a different strategy.
I use the same method for backing up my Linux boxes. Didn't even think it could apply to my new Mac!

Just a few notes: the -r flag is redundant as -a implies -r

Also I'd add a note about the -E flag, that is kind of important for Mac files, thanks iceman for pointing that out!

Also you can run rsync directly from cron, no need to create a shell script, that is also a little redundant.
