Automagic backups?


#1

I have a customer who has a website she updates using FrontPage, so I had the FrontPage extensions installed for that domain. (Not my choice; I hate them.) Anyway, with the extensions installed, DH no longer makes its lovely automatic backups, nor can you access the files via FTP or a shell account (because of ownership stuff, according to the KB). For my other domains I've set up a system where I keep a local backup of all my website files and MySQL databases through an automagic cron job. I was wondering if anybody knew of a way to do something similar for this domain? I'd prefer an easier way than loading (gasp) FrontPage on my own computer and importing the web site.
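For reference, the job I run for the non-FrontPage domains is roughly along these lines (the hostname, database name, user, password, and paths here are just placeholders, not my real setup):

#!/bin/bash
# weekly backup: dated MySQL dump plus a tar of the site files
suffix=$(date +%y%m%d)
# dump one database (note: no space between -p and the password)
mysqldump -h mysql.example.com -u dbuser -pPASSWORD mydatabase > /home/user/backups/mydatabase.$suffix.sql
# archive the site's files
tar -cf /home/user/backups/site.$suffix.tar /home/user/mydomain.com/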

Thanks in advance for your suggestions!

-Matttail


#2

Well, it took me a bit to get this one figured out, but I finally got an idea a couple nights ago, and now have a working cron job that makes backups of the FrontPage domain. I have cron set to run this shell script once a week:

#!/bin/bash
# datestamp for the archive name, e.g. 060915 for 15 September 2006
suffix=$(date +%y%m%d)
# scratch directory for the mirror (-p: don't fail if it already exists)
mkdir -p /home/user/joys
cd /home/user/joys || exit 1
# pull down the live site over HTTP, staying on our own domain
wget --mirror --no-verbose --domains=www.myDomain.com www.myDomain.com
# roll the mirrored tree into a dated tar archive
tar -cf /home/user/backups/archives/joys.$suffix.tar www.myDomain.com/
# clean up the scratch copy for next week
rm -r /home/user/joys


Just copy and paste that into a text file and save it as fp.sh. The script uses the wget utility in mirror mode, which basically pulls down the site's index file, then gets all of the images and other stuff it links to, then follows the links and pulls those pages and images too. It keeps the directory structure intact as well. The --domains= part tells it to stick with my domain and not follow links to other sites I don't want backed up.
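Before wiring it into cron, it's worth making the script executable and doing one run by hand to check the output (assuming you saved it under /home/user/backups/ as in the crontab line further down):

chmod +x /home/user/backups/fp.sh
/home/user/backups/fp.sh
ls /home/user/backups/archives/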

--no-verbose is still rather noisy, though. It writes a single line for every file it downloads, and all of that will get e-mailed to you by cron. Not the best, hmm? You can change it to --quiet and that will make wget shut up entirely. The only problem, and the reason I'm using --no-verbose, is that --quiet turns off error messages too.
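A possible middle ground, if you'd rather not get mail at all: wget's --append-output flag sends all of that chatter (errors included) to a log file instead of standard output, so cron has nothing to mail you but the record is still there if something breaks. You'd swap the wget line in the script for something like:

wget --mirror --no-verbose --append-output=/home/user/backups/wget.log --domains=www.myDomain.com www.myDomain.com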

The script then creates a tar file using the current date in the file name. That's what the suffix variable is for, and it comes out as name.YearMonthDay.tar.
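For instance, running the date command on 15 September 2006:

date +%y%m%d
060915

so that week's archive would be named joys.060915.tar.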

Lastly, I clear out the files after the tar is created so everything's nice and clean for next week.
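When you eventually need one of these backups, plain tar will list or unpack it (using the example archive name from above):

tar -tf joys.060915.tar
tar -xf joys.060915.tar

The first command lists the contents; the second extracts them into the current directory.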

Use "crontab -e" from ssh to edit your crontab. Just put in a line that looks something like:
30 2 * * 4 /home/user/backups/fp.sh
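The five fields in front of the command are minute, hour, day of month, month, and day of week (0 or 7 being Sunday), so the line above runs the script at 2:30 a.m. every Thursday:

# minute hour day-of-month month day-of-week command
30 2 * * 4 /home/user/backups/fp.sh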


"That's great," you say, "but I don't want to have to log in with an FTP program every week and download the archive." I solved that one too, because I'm lazy like that.

I'm running a Windows machine at home, and instead of using cron to schedule things to run at certain times, you use Task Scheduler. Here comes the fun part: I wrote a simple MS-DOS batch file to accomplish what I want. The batch file is actually very simple. Just open up Notepad and put this in there:


ftp -i -s:ftp.txt
pause

Save that as download.bat (whatever name, really, as long as it ends in .bat), and select "All Files" as the save-as type so you don't get a .txt extension tacked on. This batch file just opens a DOS box and runs Microsoft's command-line FTP program. The "-i" option turns off interactive prompting, so it won't ask you about each file before downloading it. "-s" tells ftp to read its commands from that file; note that the path is relative, so keep ftp.txt in the same directory the batch file runs from (or set the task's "Start in" folder to match). The "pause" command simply leaves the DOS box open after ftp completes, until you press a key. While downloading, ftp will tell you how big each file was, how long it took, and, from those two figures, the average transfer rate. If you don't want the window hanging around afterwards, just remove the pause line.

For the ftp commands, use Notepad again and pop this into a new file:


open domain.com
user
password
cd backups/
binary
mget archives/
mdelete archives/
quit

Save that as ftp.txt (or whatever you want; just remember to change the batch file to reflect the proper name). This part is pretty simple: you establish the connection to your FTP account and provide the user/pass (replace "user" and "password" with your own, each on its own line). Next, my commands move into the directory one level above the one holding the archives, so I can download the entire archives directory, not just one file. I've got cron running other scripts that put archives from different sites in there too.

Anyway, next the script downloads all the files in the archives directory, saving them to the current working directory on your local machine (wherever the batch script is run from). Once the download is finished, it clears out the remote directory so that next week it only downloads the new archive. Finally, the quit command is passed and you log out of the FTP server.

Once you've got these files saved, just click Start > Control Panel > Scheduled Tasks > Add Scheduled Task. Set it up to run your batch file a bit after cron finishes making the archive, and when you wake up in the morning a black box will be waiting to let you know that everything went OK (or not!).

-Matttail