What next?


#1

From the wiki, DreamHost have said Domain Snapshots are being discontinued.

They will still have the domain restore feature (which restores from 2 hourly, 2 daily and something else…), much like the snapshots.

So, can’t the Automatic Backup scripts in the wiki be updated with the new locations where DreamHost will store these backups after removing the snapshots directory?

How else can we make tar backups of our domains and MySQL databases with automatic FTP downloading?


#2

To backup a domain folder:

tar -czf backup.tgz domain.tld

To backup a database:

mysqldump --opt -uusername -ppassword -h MySQLHostname dbname > backup.sql
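
Put together with a date suffix, a minimal nightly script could look like this (just a sketch; domain.tld, username, password, MySQLHostname and dbname are placeholders to adjust, and it assumes /home/username/backups already exists):

#!/bin/bash
# Archive the domain and dump its database, both datestamped
suffix=$(date +%y%m%d)
cd /home/username
tar -czf backups/domain.tld.$suffix.tgz domain.tld
mysqldump --opt -uusername -ppassword -h MySQLHostname dbname > backups/dbname.$suffix.sql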



#3

Oh wise one, cheers.

So, could I not just use that same script (modifying the directory):

#!/bin/bash
suffix=$(date +%y%m%d)
domains=( "domain1.com" "domain2.com" "domain3.com" )
cd /home/username/
for domain in "${domains[@]}"
do
tar -cf /home/username/backups/archives/${domain}.$suffix.tar ${domain}/
done

There, I just took out the /.snapshot/nightly.0/ part of the path. Would that work, or wouldn’t it?

I guess if it did, then I wouldn’t need to make any other changes to the instructions on the wiki.

Finally, let’s call this a challenge since it gives you a chance to show off your expert knowledge…

For the Linux instructions under the section on automatic FTP backup of the tar file created (I’m running a Mac, by the way, which I know isn’t Linux):

It isn’t clear whether there are two scripts shown. It gives you the bash script, which the wiki says to just paste into Terminal (but how does my system know to always connect with those details?), and then it gives the FTP script. Do both sections go into the ftp.sh script file?
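
Something like this is what I imagine the combined ftp.sh looks like (just my guess; HOST, USER, PASS and the paths are placeholders):

#!/bin/bash
# My guess: the FTP commands are fed to ftp through a here-document,
# so both parts of the wiki script live in this one file.
ftp -inv HOST <<EOF
user USER PASS
cd backups
binary
mget *.tar
bye
EOF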

Finally, also under the Linux area, where it mentions editing the crontab (I know how to do that via Terminal on a Mac): just confirm that I replace /home/username/backups/ftp.sh with ~/username/backups/ftp.sh since I am on a Mac?
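
For example, would a weekly crontab entry look something like this (just my guess at the path, with jayson as my Mac username)?

# Run the FTP download script every Sunday at 3am
0 3 * * 0 /Users/jayson/backups/ftp.sh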

Thank goodness for people like yourself, silly old me would be nowhere otherwise.


#4

I believe it should.

I’m not really familiar with the Wiki :frowning:

I’d be inclined to have the backups stored in a specific folder on the server and use a script/task running on your Mac that simply FTPs or rsyncs the backup directory onto your local machine at a specified time (daily, weekly, etc.).

Scott is a Mac God and I’m sure he’ll be able to instruct you on the best approach :wink:



#5

Now being really basic, how do I get in touch with Scott?

I looked at the username in case you meant that, but I am not 100% sure you do.

I think changing the directory it makes the tar from would just work, but I still need ‘Scott’s’ assistance on the bash script.

Cheers!


#6

There’s not much point to backing up within your own account. Local copies are much more reliable and useful. This is the backup script that I run on my Mac. I have a similar script for every domain I have at DreamHost, and have a master script that calls each of the domain backup scripts.

#!/bin/bash

# These two lines dump my two databases into my home directory
/usr/bin/ssh ME@MYDOMAIN.COM '/usr/bin/mysqldump --all-databases -hDB.MYDOMAIN.COM -uUSER1 -pPASS1 > DB1.sql'
/usr/bin/ssh ME@MYDOMAIN.COM '/usr/bin/mysqldump --all-databases -hDB.MYDOMAIN.COM -uUSER2 -pPASS2 > DB2.sql'

# This command updates the mirror (including my DB dumps) in my homedir's Sites folder on my home machine
rsync -avze ssh --delete 'ME@MYDOMAIN.COM:.' '/Users/MYMACUSER/Sites/MYDOMAIN'

# Delete my database dumps from my home directory since I only needed them for rsync
/usr/bin/ssh ME@MYDOMAIN.COM '/bin/rm *.sql'

-Scott

p.s. Since I run Time Machine at home, I have somewhat the same functionality as snapshots. If you really want timestamped tarfiles, after you run the above script, tar locally (on your home machine) from your just-updated local mirror to a timestamped tarfile.
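
For example (a sketch; the paths are mine and would need adjusting):

suffix=$(date +%y%m%d)
# tar from the just-updated local mirror into a timestamped tarfile
tar -czf ~/Backups/MYDOMAIN.$suffix.tgz -C /Users/MYMACUSER/Sites MYDOMAIN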


#7

Okay, sounds interesting, but here is where I get you typing more.

If, for example, some hapless person (myself) decided to follow the wiki (which I have), how can I adapt your script to just download the files?

I’ve got everything in /home/USERNAME/Backups/ on Dreamhost.

I assume that if I placed the bash script (the one you’re hopefully going to write briefly) in a folder in my Mac’s home directory, say ~/jayson/DHBackups/, then the bash script would download the TARs into there, since I wouldn’t give it a specific directory?

Basically, what I am begging you for is just to make my life really simple and write the script that uses SSH (I’ve already followed the directions for passwordless login via SSH) to go to that directory on my domain and download the files to that directory on my Mac.

I would consider the script you just explained, but I’ve practically set up the automatic backup script given in the DreamHost Wiki; it’s just this I need help with.

I’ll then use iCal to run the script weekly or, if I can be bothered, add it as a cron job via Terminal.app.

Really appreciate if you can do that.

If you can, I’ll get around to updating the Wiki too, just to make it really clear to those on Mac.

Cheers from England.

PS: I understand that copies stored on DH are useless, since the idea is to back up off the server, but this seems practically similar to me; it’s only that I am not mirroring my site and running it locally for testing in my Sites folder. I actually prefer archived TARs of the site; I can store them for months like an OCD-obsessed data person thingy.


#8

#!/bin/bash
/usr/bin/rsync -avze ssh --delete 'ME@MYDOMAIN.COM:/home/USERNAME/Backups' ~/DHBackups

This command will make DHBackups look just like Backups. The --delete option makes your Mac a true mirror by deleting a file locally if it’s no longer on the server. Get rid of --delete if you want a true archive.

I’d be sure to keep the destination directory in the rsync command, otherwise you may end up mirroring your entire Backups to some unwanted folder. Your ~/jayson destination directory will most likely create a /Users/jayson/jayson directory if your username is jayson. ~ means your home directory, and there’s no need to specify your username. To be safe, you can avoid the ~ confusion by specifying the full path of /Users/jayson/DHBackups instead.
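
In other words, something like this (with jayson as the example username):

/usr/bin/rsync -avze ssh --delete 'ME@MYDOMAIN.COM:/home/USERNAME/Backups' /Users/jayson/DHBackups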

-Scott


#9

Right, got myself into a mess with this.

I was looking at that script, since I’ve decided I no longer need tars of everything; just having .sql backups of the databases and a mirror of my site is enough. However, if I rsync my home directory, I’ll get all sorts of things I do not need, such as…

So, since I am going to use your method after all, AND I shall be dumping my databases (each has a different username, so only one database and one username link together), I would just appreciate you checking the sample below. I’m trying to dump the database .sql files into a ‘Databases’ directory within my /home/username/ directory on DH.

#!/bin/bash
/usr/bin/ssh ME@MYDOMAIN.COM '/usr/bin/mysqldump --all-databases -hmysql.mydomain.com -uxxxxxx -pxxxxxxxx > /Databases/DB1.sql'
/usr/bin/ssh ME@MYDOMAIN.COM '/usr/bin/mysqldump --all-databases -hmysql.mydomain.com -uUSER2 -pPASS2 > /Databases/DB2.sql'
rsync -avze ssh 'ME@MYDOMAIN.COM:/home/USERNAME/domain1.com' '/Users/jayson/Backup'
rsync -avze ssh 'ME@MYDOMAIN.COM:/home/USERNAME/Databases' '/Users/jayson/Desktop/BackMeUpSafelySoon'
rsync -avze ssh 'ME@MYDOMAIN.COM:/home/USERNAME/domain2.com' '/Users/jayson/Backup'
rsync -avze ssh 'ME@MYDOMAIN.COM:/home/USERNAME/domain3.com' '/Users/jayson/Backup'
/usr/bin/ssh ME@MYDOMAIN.COM /bin/rm /Databases/*.sql

I’ve put xxxx where the username and password will go, keeping the leading -u and -p, and the same for the hostname. Since there’s only 1 database for each user (although maybe 3 databases under the same hostname), I can leave --all-databases, since that will still keep each database in a separate .sql file.

The first two lines: since I will be running the above script on my Mac, I don’t need to give it directories? I mean, because when using SSH the default directory is the home directory, the script dumps the .sql files into the home directory when no directory paths are set, right?

Now, it seems I am SSH-ing for each command; is this right, or can I change the script in some way that does all of the above in one go, done and dusted? As long as I get my files I don’t mind. I’m just concerned about my slashes etc. and making sure it all looks good to go, where I’ve added the directories of where to dump the database data and so on.

I use Time Machine as you do, so I now don’t need to datestamp my .sql files with Automator or anything (although I use FileVault, so I can’t use the graphical fly-motion browsing of my files any more).

Thanks for your help. If you (or anyone, for that matter) could just kindly check that script, I can then save it as script.sh and run it each week.


#10

Don’t write to /Databases/… Make it /home/USERNAME/Databases/…
And fix the last line to remove /home/USERNAME/Databases/*.sql

The -u and -p for DB1 and DB2 are usually different, unless you have all tables for both “databases” in the same master database (a generally bad idea). The --all-databases will grab everything for USER1 only. So you have to run the second/third/etc. line for USER2, USER3, etc…

The first two lines, the mysqldump commands, run on the server and dump into the Databases directory in your DH account. Just apply my first and second fixes listed above.

Each command does need its own SSH; you’re making several unique requests to the server. You have to run most of this from your home machine. If you split the task, then you need a script at both ends, which is a real headache to coordinate.
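
With those fixes applied, the script from the previous post would look something like this (still a sketch; usernames, hostnames and passwords are placeholders):

#!/bin/bash
# One mysqldump per database user, writing into /home/USERNAME/Databases
/usr/bin/ssh ME@MYDOMAIN.COM '/usr/bin/mysqldump --all-databases -hmysql.mydomain.com -uUSER1 -pPASS1 > /home/USERNAME/Databases/DB1.sql'
/usr/bin/ssh ME@MYDOMAIN.COM '/usr/bin/mysqldump --all-databases -hmysql.mydomain.com -uUSER2 -pPASS2 > /home/USERNAME/Databases/DB2.sql'
rsync -avze ssh 'ME@MYDOMAIN.COM:/home/USERNAME/domain1.com' '/Users/jayson/Backup'
rsync -avze ssh 'ME@MYDOMAIN.COM:/home/USERNAME/Databases' '/Users/jayson/Desktop/BackMeUpSafelySoon'
rsync -avze ssh 'ME@MYDOMAIN.COM:/home/USERNAME/domain2.com' '/Users/jayson/Backup'
rsync -avze ssh 'ME@MYDOMAIN.COM:/home/USERNAME/domain3.com' '/Users/jayson/Backup'
# Clean up the dumps using the full path
/usr/bin/ssh ME@MYDOMAIN.COM '/bin/rm /home/USERNAME/Databases/*.sql'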

-Scott