Is there a more efficient way to backup a large folder?


#1

I have a WordPress site. Currently I’m rebuilding my site, but when I’m done, I am planning to do monthly backups of the wp-content folder (the databases are backed up every week).

Currently, my way of backing up is to download the entire folder with FTP.

As far as I understand, FTP has fairly high overhead, and I’d rather not get an email from the DreamHost people saying I’m using too much bandwidth (or slow down all the other sites on my shared server, which wouldn’t be very nice).

Also, I tend to leave my computer on overnight, so the download doesn’t need to be fast. I could put a speed cap on the FTP transfer, but that doesn’t address the overhead.

Does anyone have a suggestion for a more efficient way to do the backups? Note that I prefer to compress the files myself (I use 7z on the Ultra setting; I don’t need the files on the fly), and I exclude a couple of folders.
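To give an idea, the compression step is basically a single 7z command along these lines (the archive name and excluded folder names here are just examples, and -mx=9 is roughly the GUI’s Ultra preset):

    7z a -t7z -mx=9 wp-content-backup.7z wp-content -xr!cache -xr!some-other-folder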

Thanks.


#2

How big are these files?


#3

The largest one is around 75 MB (it’s a 7z archive of large bitmap files), but most of them are under 5 MB. Lots of images and PDF files. (Don’t worry, I use thumbnails.)


#4

There are a few scripts out there that will back up your site to your backups user account; that works well for me.
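Roughly speaking, those scripts boil down to archiving the folder and copying it to the backups account over SSH. A minimal sketch (the hostname, username, and paths are placeholders, not DreamHost specifics):

    #!/bin/bash
    # Archive wp-content (skipping a cache folder), then copy it to the backups account.
    tar --exclude='wp-content/cache' -czf ~/wp-content-backup.tar.gz -C ~/example.com wp-content
    scp ~/wp-content-backup.tar.gz backup-user@backupserver.example.com:

Run something like that from cron once a month and the backup happens server-to-server, with no FTP download on your end.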

Other than that, rsync will transfer only the files that have changed, and only the changed parts of those files. There are many GUIs available for it as well; I’ve tried luckyBackup (for Linux), which seems to work well.
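For example, something like this pulls down only what has changed since the last run (the hostname and paths are placeholders):

    rsync -avz --exclude 'cache/' user@server.example.com:example.com/wp-content/ ~/backups/wp-content/

Adding --delete would also remove local copies of files that were deleted on the server, if you want a true mirror. You could still run 7z on the local copy afterwards if you want a compressed archive.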