Automatic backup software for Mac

Teecee

Gawd
Joined
May 31, 2005
Messages
948
I need a tool that will automatically back up data on a Mac. I will probably have two Macs somehow connected to an external HDD. I would like to back up the files on the Macs to the external drive at a certain time every day. Then I would also like to set up some sort of online automatic backup system, something that stores the files offsite. I have looked at Mozy and Amazon S3 with JungleDisk. Does anyone have any experience with this who could make some suggestions? Thank you very much.
 
I guess you are running 10.4; otherwise you'd take a look at Time Machine.

You could use SuperDuper! and CrashPlan together; I hear that works well.

As for offsite, I use my DreamHost web hosting. It's great, I haven't ever had any problems with it, and mine is only costing me 120 bucks a year with 270 GB of storage space and 5,527 GB of bandwidth per month. I'm barely even scratching that bandwidth, and both storage and bandwidth grow every week. Right now they are running a signup special: 500 GB storage and 5 TB bandwidth for only 6 bucks a month, depending on prepayment. Take a look at www.dreamhost.com
 
Offsite, I used Amazon S3 and JungleDisk for a while. JungleDisk updates once an hour; if anything is new (you can specify folders to include, like documents, music, etc.) it will upload it to the server. The first upload was a pain in the ass, but it works out alright.

I stopped using it because I had no real need for it but it worked great and was pretty cheap.
 
For the DreamHost solution, did you write some script that runs at a certain time, connects to your DreamHost FTP, and uploads certain files and folders? Remember, I need hands-off automatic.

Thanks for the info about the JungleDisk.
 
I do mine manually, only backing up certain folders, through Transmit. Yes, it could be automated, but I don't need it to be.

That is what CrashPlan does: it keeps a lookout for changes and keeps your backup updated. I think you can set it up for any server (I know they added S3 support), but all I have is DreamHost, and that is all I need.
 

I use DreamHost as well. DreamHost allows SSH, which means you can rsync your files over. I have mine set up in a cron job that runs each night around midnight and rsyncs my files over. This is 100% hands off. Like mentioned before, the initial upload was painful, but after that it only takes a few minutes, depending on the new/changed files and your connection. This is also completely transparent; even if you are on the machine you don't know it's running, because it runs in the background. The only way you can tell is if you look at the current processes or if your connection slows down enough to notice :D
 
Hrmmm... maybe I should talk to you and take a look at this "hands off" magic that you speak of. I have been telling myself that I need to "get serious" about backing up my data, but I always turn out to be too lazy to do anything other than dump a few folders on my FTP. Got any more details you can share?
 

Ok, I've had several people PM me, so I'll update this thread with a quick how-to. It's extremely simple if you know the command line at all. If there's enough interest I could probably put together a quick script that would set it all up for you.

First off, you need to know the rsync command; it's simple.

rsync -aruv --ignore-errors /Users/ user@source.com:source.com/Backups/ >>/Users/youruser/Documents/rsync-info.txt

A quick explanation of what this is doing (you can type rsync -h or man rsync at a command prompt to get this info as well).

-a is the archive flag, which will (or should) preserve date stamps, ownership, groups, and permissions. All of these are important for seamless recoveries. -r is the flag for recursive; it will allow rsync to recursively traverse all the directories from the point you tell it to start. -u is the update flag. In reality rsync should do this by default, but I have always found it 'makes' it only update files that have been changed and not retransfer untouched items. --ignore-errors does just that: it makes it so the process doesn't stop if it hits something it can't transfer (sometimes permissions or filenames can choke the process).

/Users/ is telling it to back up the Users folder and everything below it. If you only want your own user's items, then you could pass /Users/youruser/, or ~/ will do the same thing.

user@source.com:source.com/Backups/ is telling it to transfer your items using the 'user' user; the source.com is whatever you ssh to. So if you own imreallycoolipromise.com and you ssh to it by using user@imreallycoolipromise.com, then you'd use this same info here. Then (at least for DreamHost; each host will vary here) the part after the colon is the path where you want it to put your files. So if you want your files in a directory under your domain called Backups, on DreamHost, you'd use that path. The reason is, when you ssh (or FTP) in to your DreamHost account it takes you to your user directory, and from there you go to your different domain directories.

The last part, >>/Users/youruser/Documents/rsync-info.txt, creates a .txt file in your Documents folder with all the info from the rsync session. Having >> instead of just > will append to the end of the file instead of overwriting it each time.

I hope this is making a little sense so far. The next step, after you have your command, is to set up your SSH keys so you have passwordless login. See the link here for how to do it. This makes it so your computer can ssh in (or rsync) and not get hung up waiting for a password.
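The key setup boils down to a couple of commands. A rough sketch, where backup_key is an arbitrary filename and user@source.com is a stand-in for your own host:

```shell
#!/bin/sh
# Generate a key pair with no passphrase (-N "") so cron can use it
# unattended; a passphrase-protected key would stall the job at a prompt.
mkdir -p "$HOME/.ssh"
ssh-keygen -t rsa -N "" -f "$HOME/.ssh/backup_key" -q

# The public half is what goes on the server:
cat "$HOME/.ssh/backup_key.pub"

# Get that line into ~/.ssh/authorized_keys on the server, e.g. with:
#   ssh-copy-id -i ~/.ssh/backup_key.pub user@source.com
# After that, 'ssh -i ~/.ssh/backup_key user@source.com' logs in with no
# password, and rsync-over-ssh in the cron job won't hang waiting for one.
```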

You're almost finished :D Now you need to set up your cron job. You can read the wiki page here for a good explanation of cron. Basically it's a Unix way of scheduling things to happen :p To set up your cron job, really the easiest way is to download a program such as CronniX. This will give you a GUI to schedule your cron job. I do highly suggest at least reading the wiki page to get a small understanding of the scheduling and creation of cron jobs. Now modify the above rsync command and paste it in as the command to execute, then schedule in CronniX when you want it to run. You probably want to schedule this to run as the root user to avoid permission problems if you are backing up more than just your own user's stuff.
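If you'd rather skip the GUI, the same schedule can be added straight from the command line. A sketch, reusing the example paths from above (substitute your own):

```shell
# Append the backup job to the current user's crontab without opening an
# editor. The five schedule fields are:
#   minute hour day-of-month month day-of-week
# so '5 0 * * *' means five minutes past midnight, every day.
( crontab -l 2>/dev/null; \
  echo '5 0 * * * rsync -aruv --ignore-errors /Users/ user@source.com:source.com/Backups/ >> /Users/youruser/Documents/rsync-info.txt 2>&1' \
) | crontab -

# Confirm it took:
crontab -l
```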

If you'd like, you can paste the above rsync command on a command line with your variables and see if it runs. It should! I know this is a very high-level overview, so if you have questions let me know. This was also all off the top of my head, so I'm sure I may have missed something. I'll see if I can put together an Automator script or maybe a shell script to do all these same things.
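Along the lines of that shell script idea, here's a rough sketch of what a wrapper could look like. The do_backup name, the /tmp demo paths, and the log location are all made up for illustration; in real use the second argument would be your user@source.com:... destination:

```shell
#!/bin/sh
# Sketch of a wrapper: do_backup SOURCE DEST LOGFILE
# runs the rsync and brackets it with timestamped markers in the log,
# so the crontab line can stay short and each night's run is easy to find.
do_backup() {
    echo "=== backup started $(date) ===" >> "$3"
    rsync -aruv --ignore-errors "$1" "$2" >> "$3" 2>&1
    echo "=== backup finished $(date) ===" >> "$3"
}

# Local demo with throwaway paths.
mkdir -p /tmp/bw-src /tmp/bw-dst
echo "data" > /tmp/bw-src/file.txt
do_backup /tmp/bw-src/ /tmp/bw-dst/ /tmp/bw.log
```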
 
Great tutorial. I am going to check this out and get back with you. Thanks for your help.
 
Quick question, will a cron job wake a mac out of sleep mode to run a script?
 