Backup Scripts for DreamObjects


#1

Hello Everybody!

I’m looking for a how-to or script that allows me to back up my MySQL databases and my home directory (/home/user/) daily.

I tried following these instructions, but I’m not getting it to work at all… lost in the sauce :slight_smile:

If you know where I can find an up-to-date guide, how-to, or tutorial, or if you have a script that will dump my database to a folder and then back up everything via tar (or whatever) to DreamObjects, you would be awesome!

I do have a bash script I’m currently using that uploads the tar files via FTP to another server I own. If you could help me modify that to upload to DreamObjects instead, that would be even better, since I understand how that script works and all.


#2

Hi, you can use the mysqldump command to dump your MySQL DB to a file; documentation for that tool can be found at http://dev.mysql.com/doc/refman/5.7/en/mysqldump.html . Once you have dumped your DB to a file, you can upload it to DreamObjects using the AWS CLI; our documentation for that is found at https://help.dreamhost.com/hc/en-us/articles/216335908-How-to-use-AWS-CLI-with-DreamObjects . I would recommend writing a small shell script that dumps your DB, tars your home directory, and then uploads both to DreamObjects using the AWS CLI. You can then run that script from a cron job so it runs at a scheduled interval.
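
A minimal sketch of that approach might look like the following; the bucket name, database names, credentials, and paths are placeholders, and the endpoint is the one used elsewhere in this thread:

#!/bin/bash
# Sketch only: dump a database, archive the home directory, upload both.
# dbuser, dbpass, mydb, mybucket, and /home/user are placeholder names.
NOW=$(date +"%d-%m-%Y")
mkdir -p /home/user/backup

# dump one MySQL database to a compressed file
mysqldump -u dbuser -pdbpass -h mysql.example.com mydb | gzip > /home/user/backup/mydb-$NOW.sql.gz

# archive the home directory, excluding the backup folder itself
tar -czf /home/user/backup/home-$NOW.tar.gz --exclude=/home/user/backup /home/user

# upload everything to DreamObjects with the AWS CLI (note --recursive for directories)
aws --endpoint-url https://objects-us-west-1.dream.io s3 cp /home/user/backup/ s3://mybucket/backup/ --recursive

A crontab entry such as 0 3 * * * /home/user/backup.sh would then run it daily at 3am.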


#3

Hi

You can use the mysqldump tool to dump your databases to a file; documentation for that is available at http://dev.mysql.com/doc/refman/5.7/en/mysqldump.html .
Since you already have a script that tars your home directory and uses FTP to copy it to another server, you can modify that script to use the AWS CLI instead of FTP and send your data to DreamObjects. Our documentation for that is available at https://help.dreamhost.com/hc/en-us/articles/216335908-How-to-use-AWS-CLI-with-DreamObjects .
Let me know if you have any more questions.

Caleb


#4

Thanks for the reply, Caleb!

I attempted to follow the instructions as presented and got everything installed and configured, the way I understand it anyway.

I tried to upload a directory and got an error. Here is what I attempted:
aws --endpoint-url https://objects-us-west-1.dream.io s3 cp /home/user/directory/ s3://bucket/directory

The error I got is:
upload failed: directory/ to s3://bucket/directory [Error 21] Is a directory: u'/home/user/directory/'
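
(For anyone hitting the same error: aws s3 cp refuses a bare directory as its source. Adding the --recursive flag, or using aws s3 sync, is the usual fix; a sketch with the same placeholder paths:

aws --endpoint-url https://objects-us-west-1.dream.io s3 cp /home/user/directory/ s3://bucket/directory --recursive

)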

I thought it might be best to show you my script. If you can help me out with it, I would greatly appreciate it! For some reason I just don’t get the guides; I need something that goes from step 1 through to the end without having to jump around. I know that is a hard thing to request since it would also duplicate information and all.

I got my script from this site:

Here is my script:

#!/bin/bash

# System Setup

DIRS="/home/user/directory"
BACKUP=/home/user/mytmp/backup.$$
NOW=$(date +"%d-%m-%Y")
DAY=$(date +"%a")
FULLBACKUP="True"

# MySQL Setup

MUSER="username"
MPASS="password"
MHOST="host"
MYSQL="$(which mysql)"
MYSQLDUMP="$(which mysqldump)"
GZIP="$(which gzip)"

# FTP server Setup

FTPD="bk/incremental"
FTPU="username"
FTPP="password"
FTPS="domain"
NCFTP="$(which ncftpput)"

# Other stuff

EMAILID="email"

# Start Backup for file system

[ ! -d $BACKUP ] && mkdir -p $BACKUP || :

# Force full backup

if [ "$FULLBACKUP" == "True" ]; then
  FTPD="bk/full"
  FILE="fs-domain-$NOW.tar.gz"
  tar -cvpzf $BACKUP/$FILE $DIRS
else
  FTPD="bk/full"
  FILE="fs-domain-$NOW.tar.gz"
  tar -cvpzf $BACKUP/$FILE $DIRS
fi

# Start MySQL Backup

# Get all database names
DBS="$($MYSQL -u $MUSER -h $MHOST -p$MPASS -Bse 'show databases')"
for db in $DBS
do
  FILE=$BACKUP/mysql-$db.$NOW.sql.gz
  $MYSQLDUMP -u $MUSER -h $MHOST -p$MPASS $db | $GZIP -9 > $FILE
done

# Dump backup using FTP

# Start FTP backup using ncftp
ncftp -u "$FTPU" -p "$FTPP" $FTPS <<EOF
mkdir $FTPD
mkdir $FTPD/$NOW
cd $FTPD/$NOW
lcd $BACKUP
mput *
quit
EOF

# Remove tmp files

rm -r /home/user/mytmp/*


#5

OK, I have made it a little bit further, although I had to switch to boto-rsync since I could not figure out awscli, and now I have another problem.

I attempted to run this command to test boto-rsync:

boto-rsync -a ACCESSKEY -s SECRETKEY --endpoint objects-us-west-1.dream.io /home/user/directoryiwanttobackup/ s3:/mybucket

When I run the command, I just get the -h usage readout. Any ideas?

Also, I think I know how to modify my script above to use the above line once I get it working. Just a note, though: I could not get my .boto file to work. I used the example from the how-to linked above:

[Credentials]
aws_access_key_id = MYKEY
aws_secret_access_key = MYSECRETKEY

The boto-rsync command didn’t even recognize the .boto file (located in my /home/user/ directory).

EDIT

Alright, found my error: I missed the "//" after "s3:" and had only one slash. :slight_smile:

I do have another problem, I get this error message:
Yikes! One of your processes (boto-rsync, pid 13209) was just killed for excessive resource usage.
Please contact DreamHost Support for details.

I think it has to do with trying to upload a large amount of data. If I read correctly I could set the data rate, but what would be a good data rate for the upload?
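
(Reading ahead: the command that eventually works in post #6 below throttles boto-rsync with --chunk-size=10 and --workers=0, which appears to be what keeps it under the shared-host resource limits:

boto-rsync -a ACCESSKEY -s SECRETKEY --endpoint objects-us-west-1.dream.io --chunk-size=10 --workers=0 /home/user/directory/ s3://mybucket

)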


#6

The DreamHost Support Team was able to help me out beyond expectations! This is the whole reason I LOVE DreamHost!

Stay Awesome DreamHost!

For anyone who is interested in how I am doing my backups… or if anyone has a better suggestion… here is my entire script:

#!/bin/bash

# System Setup

DIRS="/home/user/directory /home/user/directory2"
BACKUP=/home/user/mybktmp/backup.$$
NOW=$(date +"%d-%m-%Y")
DAY=$(date +"%a")
FULLBACKUP="True"

# MySQL 1 Setup

MUSER1="username"
MPASS1="password"
MHOST1="hostname"
MYSQL1="$(which mysql)"
MYSQLDUMP1="$(which mysqldump)"
GZIP1="$(which gzip)"

# MySQL 2 Setup - if you use multiple hostnames for your MySQL databases; delete this section otherwise

MUSER2="username"
MPASS2="password"
MHOST2="hostname"
MYSQL2="$(which mysql)"
MYSQLDUMP2="$(which mysqldump)"
GZIP2="$(which gzip)"

# Other stuff

EMAILID="yourEMAILaddress"

# Start Backup for file system

[ ! -d $BACKUP ] && mkdir -p $BACKUP || :

# Force full backup

if [ "$FULLBACKUP" == "True" ]; then
  FILE="fs-site-$NOW.tar.gz"
  tar -cvpzf $BACKUP/$FILE $DIRS
fi

# Start MySQL 1 Backup

# Get all database names
DBS1="$($MYSQL1 -u $MUSER1 -h $MHOST1 -p$MPASS1 -Bse 'show databases')"
for db in $DBS1
do
  FILE=$BACKUP/mysql-$db.$NOW.sql.gz
  $MYSQLDUMP1 -u $MUSER1 -h $MHOST1 -p$MPASS1 $db | $GZIP1 -9 > $FILE
done

# Start MySQL 2 Backup - if you use multiple hostnames for your MySQL databases; delete this section otherwise

# Get all database names
DBS2="$($MYSQL2 -u $MUSER2 -h $MHOST2 -p$MPASS2 -Bse 'show databases')"
for db in $DBS2
do
  FILE=$BACKUP/mysql-$db.$NOW.sql.gz
  $MYSQLDUMP2 -u $MUSER2 -h $MHOST2 -p$MPASS2 $db | $GZIP2 -9 > $FILE
done

# Conduct boto-rsync

boto-rsync -a ACCESSKEY -s SECRETKEY --endpoint objects-us-west-1.dream.io --chunk-size=10 --workers=0 /home/user/mybktmp/ s3://[bucketname]

# Remove tmp files

rm -r /home/user/mybktmp/*
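
To run this on a schedule, a crontab entry along these lines should work; the 3am time and script path are placeholders:

0 3 * * * /bin/bash /home/user/mybk/bk.sh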


#7

Wonderful news @legacy! If you want to get your next month of cloud usage free (DreamObjects is part of the cloud), you can write a simple tutorial on how to use DreamObjects to back up your DreamHost site. You’ve basically done most of the hard work; all that’s left is to clean up the instructions: whether you need to install boto-rsync, how to configure it, maybe how to add the script to a cron job, and, most importantly, how to restore the backups.

All the details on the documentation bounty program are at https://github.com/dreamhost/dreamcloud-docs/blob/master/CONTRIBUTING.rst. You don’t need to submit a pull request: you can also just write up more details here on the forum.

Cheers


#8

Will do, thanks for the info!

I will write out a full guide here for getting started, setting up the cron job, and what I do to restore from my backups once I can get back to my computer. I will say that I have some experience restoring from these backups, for both my site files and my MySQL databases, and the process has not failed me yet… :slight_smile:
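
In the meantime, a rough sketch of the restore steps for backups in this format; the file names, credentials, and paths below are illustrative:

# pull the backup files back down from DreamObjects (boto-rsync syncs in either direction)
boto-rsync -a ACCESSKEY -s SECRETKEY --endpoint objects-us-west-1.dream.io s3://mybucket/backup/ /home/user/restore/

# unpack the file-system archive back into place
tar -xvpzf /home/user/restore/fs-site-01-01-2016.tar.gz -C /

# load a database dump back into MySQL
gunzip < /home/user/restore/mysql-mydb.01-01-2016.sql.gz | mysql -u username -ppassword -h hostname mydb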


#9

I am trying to use the script as a template for my own and have a few questions:

  1. I assume home/user/directory means home/myusername/"actual directory I want to back up". Is that correct?

  2. Can I have three or more directories?

  3. Not sure what to put in for any of these:
    MYSQL1="$(which mysql)"
    MYSQLDUMP1="$(which mysqldump)"
    GZIP1="$(which gzip)"

  4. I assume I needed to create a directory on my server at the location of /home/user/mybktmp/ but not change the backup.$$ - correct?

  5. Other than changing my email address to replace yourEMAILaddress I assume I don’t change anything else until I get to boto-rsync -a ACCESSKEY -s SECRETKEY… Correct?

  6. Can I create variables by creating lines near the top with
    ACCESSKEY=""
    SECRETKEY=""
    so I remember what they are? Then I change the line to:
    boto-rsync -a $ACCESSKEY -s $SECRETKEY… Or will that break it? I am pretty sure I know what to do with the rest of the line, assuming the answer to my question 1 is that I got it right :wink:

Sorry I am so inexperienced at this and am asking beginner questions. I appreciate the help.


#10

No problem at all with the questions, I had to sideline this project and forgot about it.

Your questions:

Q1. Yes, /home/yourusername/whatever
Q2. Yes, just add a space between the directories. Example:
"/home/yourusername/dir1 /home/yourusername/dir2"
Q3. No edit needed, see the new script and instructions below.
Q4. Yes, I use two directories: mybk (I store the scripts here) and bktmp (this is where it stores the files for transfer)
Q5. In the new script I removed the email address since you can do that with the cronjob anyway.
Q6. You can, I just didn’t. I may give it a try since I use this script for multiple domains.
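
For Q6, it would look roughly like this near the top of the script, with the key values as placeholders:

ACCESSKEY="yourAccessKey"
SECRETKEY="yourSecretKey"
boto-rsync -a $ACCESSKEY -s $SECRETKEY --endpoint objects-us-west-1.dream.io --chunk-size=10 --workers=0 /home/user/mybktmp/ s3://mybucket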

I have updated the script some as well; here is the new script I’m using. I apologize ahead of time: I’m doing this on my mobile, so if you get any errors, let me know and I’ll fix them tonight. Be sure to follow DreamHost’s guide on how to set up s3cmd: https://help.dreamhost.com/hc/en-us/articles/215916627-How-to-use-S3cmd-with-DreamObjects

Save the script below in a file (bk.sh). Replace each [PUT…] placeholder with your information, and be sure not to keep the [ ].

bk.sh
----------------------start don’t include this line----------

#!/bin/bash

# System Setup

DIRS="/home/[PUTYOURUSERNAMEHERE]/[PUTYOURDIRECTORYHERE]" #ex. /home/user/dir
BACKUP=/home/[PUTYOURUSERNAMEHERE]/bktmp/backup
NOW=$(date +"%d-%m-%Y")
DAY=$(date +"%a")
FULLBACKUP="True"

# MySQL 1 Setup

MUSER1="[PUTYOURMYSQLUSERNAMEHERE]"
MPASS1="[PUTYOURMYSQLUSERPASSWORDHERE]"
MHOST1="[PUTYOURMYSQLHOSTNAMEHERE]"
MYSQL1="$(which mysql)"
MYSQLDUMP1="$(which mysqldump)"
GZIP1="$(which gzip)"

# Start Backup for file system

[ ! -d $BACKUP ] && mkdir -p $BACKUP || :

# Force full backup

if [ "$FULLBACKUP" == "True" ]; then
  FILE="fs-mysite-$NOW.tar.gz"
  tar -cvpzf $BACKUP/$FILE $DIRS
fi

# Start MySQL 1 Backup

# Get all database names
DBS1="$($MYSQL1 -u $MUSER1 -h $MHOST1 -p$MPASS1 -Bse 'show databases')"
for db in $DBS1
do
  FILE=$BACKUP/mysql-$db.$NOW.sql.gz
  $MYSQLDUMP1 -u $MUSER1 -h $MHOST1 -p$MPASS1 $db | $GZIP1 -9 > $FILE
done

# Conduct boto-rsync

boto-rsync -a [PUTYOURACCESSKEYHERE] -s [PUTYOURSECRETKEYHERE] --endpoint objects-us-west-1.dream.io --chunk-size=10 --workers=0 /home/[PUTYOURUSERNAMEHERE]/bktmp/ s3://[PUTYOURBUCKETNAMEHERE]

# Remove tmp files

rm -r /home/[PUTYOURUSERNAMEHERE]/bktmp/*

# Delete files from bucket

/home/[PUTYOURUSERNAMEHERE]/bk/clean.sh

--------------end don’t include this line----------

Use this to clean up your old backups from DreamObjects. If you don’t want the script to clean up your bucket for you, remove the last lines of bk.sh under the "Delete files from bucket" comment.

clean.sh (I initially wanted this included in the script above, but this was the only way I could get it working. If you or someone else figures it out, please let me know.)

----------------------start don’t include this line----------

# change the number of days to however long you want to keep the backups

/home/[PUTYOURUSERNAMEHERE]/mybk/cleando.sh "[PUTYOURBUCKETNAMEHERE]/backup" "7 days"

--------------end don’t include this line----------

cleando.sh (no edits needed)

----------------------start don’t include this line----------

#!/bin/bash

usage () {
echo " "
echo "Usage: s3-del-old \"bucketname\" \"time\""
echo "Example: s3-del-old \"mybucket\" \"30 days\""
echo " "
echo "Do not include a leading slash in bucketname."
echo " "
}

# if an incorrect number of parameters is given, show usage

if [ $# -lt 2 ]; then
usage
exit 2
elif [ $# -gt 2 ]; then
usage
exit 2
fi

# don't allow a leading slash in bucketname

firstchar=${1:0:1}
if [ $firstchar = "/" ]; then
echo "ERROR: Do not start bucketname with a slash."
usage
exit 2
fi

# don't allow "s3:" at the beginning of bucketname

teststring=${1:0:3}
teststring=${teststring,}
if [ $teststring = "s3:" ]; then
echo "ERROR: Do not start bucketname with \"s3:\""
usage
exit 2
fi

# transform the first parameter into a fully formed s3 bucket parameter with a trailing slash
target='s3://'${1%/}'/*'

# list everything under the bucket and delete objects older than the given age
s3cmd ls $target | while read -r line;
do
  # the first two fields of each listing line are the creation date and time
  create_date=$(echo $line | awk '{print $1,$2}')
  create_date_unixtime=$(date -d "$create_date" +%s)
  older_than_unixtime=$(date -d "-$2" +%s)
  if [[ $create_date_unixtime -lt $older_than_unixtime ]]
  then
    # the fourth field is the object URL
    filename=$(echo $line | awk '{print $4}')
    if [[ $filename != "" ]]
    then
      echo deleting $filename $create_date
      s3cmd del $filename
    fi
  fi
done

--------------end don’t include this line----------

Hope this helps!
If anyone has a better way of doing this with a ready made script, please share.


#11

Thanks for the update, legacy!
I have some new questions now:

  1. Do I replace /home/[PUTYOURUSERNAMEHERE]/bk/clean.sh with /home/[PUTYOURUSERNAMEHERE]/mybk/cleando.sh "[PUTYOURBUCKETNAMEHERE]/backup" "7 days"
    to be able to define what bucket and how many days to keep?

  2. Is it a mistake that the first script calls clean.sh while the heading says "cleando.sh (no edits needed)"? Or does it mean the script should be called cleando.sh (with no edits needed to the name, but left as clean.sh in the bk.sh script) and the script itself needs editing?

  3. Is there some way to keep monthly backups plus the most recent 7 days but delete the rest?

  4. For VPS, is the only thing I need to do follow the instructions at https://help.dreamhost.com/hc/en-us/articles/215916627-How-to-use-S3cmd-with-DreamObjects , or do I also need to install boto and/or something else?

Thanks again for your help.


#12

Sorry about that; I was typing all of that on my mobile, with some horrible copy and paste from the JuiceSSH app on my phone.

Q1. Yeah, I made that slightly confusing… trying to put all of the code up without missing anything was difficult, even with my Samsung Note. Also, /home/[PUTYOURUSERNAMEHERE]/bk/clean.sh in the bk.sh file should be /home/[PUTYOURUSERNAMEHERE]/mybk/clean.sh. That was my fault copying and pasting from my mobile SSH app; I left out the "my".

To further answer your question you have three files: bk.sh, clean.sh, and cleando.sh.

The process is something like this:

Cronjob runs bk.sh which makes a call for the file clean.sh at the end. The clean.sh file runs the cleando.sh file with the parameters set in the clean.sh file (bucket name and number of days).

The reason for the separate clean.sh and cleando.sh files is that I cannot get the call that clean.sh makes working directly inside the bk.sh script. I don’t know why, but when I use clean.sh to run the cleando.sh file, it works. I’m working on multiple projects at the moment and this is pretty low priority since it works as is. If you can get it to work by including the line from clean.sh in the bk.sh file, please let me know; a sketch of what that would look like is below.
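
In case anyone wants to experiment, inlining would just mean replacing the clean.sh call at the end of bk.sh with the line from clean.sh itself (untested in that position):

/home/[PUTYOURUSERNAMEHERE]/mybk/cleando.sh "[PUTYOURBUCKETNAMEHERE]/backup" "7 days"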

Q2. I made the "no edits needed" comment so you know that nothing needs to be changed in that file (cleando.sh); you use it as is. The only things you should edit are the [PUTYOUR…whatever…HERE] placeholders in bk.sh and clean.sh.

Q3. With this script, I am not sure how that would be accomplished. You can set the retention to however many days you want instead of 7, and it will keep the files within that range. Anything older than 7 days, or however many days you set it to, will be deleted.

This is what I do. I have another script that sends the backup via SFTP to a remote server and I keep those files for a month. I use DreamObjects to store my 7 days worth of backups.

Q4. boto should already be installed on VPS (https://help.dreamhost.com/hc/en-us/articles/217473218-How-to-use-boto-rsync-with-DreamObjects).

As for the instructions for installing S3cmd, I would follow them to make sure everything is set up and ready.
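
For reference, the DreamObjects-specific parts of the ~/.s3cfg that the guide has you create look roughly like this; the keys are placeholders and the linked article is authoritative:

[default]
access_key = YOURACCESSKEY
secret_key = YOURSECRETKEY
host_base = objects-us-west-1.dream.io
host_bucket = %(bucket)s.objects-us-west-1.dream.io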

I hope this helps and I didn’t confuse things more for you. Let me know. I’m going to try to edit my original post to correct the /home/[PUTYOURUSERNAMEHERE]/bk/clean.sh line where I left out the "my" in the mybk directory.

If you want, PM me and I will see if I can send you the files.