Automatically create backups of home directory

apps

#1

Hi!

Does anyone here know how I can automatically create backups of my entire home directory to an external FTP server (for example, every day at 2 PM)? Is there a cron job or anything of that sort that would be useful for me? Any ideas? Thanks in advance!


#2

Hi Lenny!
Are you aware that DH offers automatic daily backups? Read about it here:
https://panel.dreamhost.com/kbase/index.cgi?area=2585&keyword=backup

TorbenGB


#3

It's always a good idea to have your own backups as well.

I personally use rsync, as it cuts down dramatically on the bandwidth requirements.

I also have a cron job which dumps my MySQL databases daily and keeps the last 5 days' worth, and these dumps get included in the rsync.
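
In case a concrete example helps, here is a minimal sketch (the username, hostname, and local path are placeholders, not anything specific to my setup) that pulls a copy of the whole home directory over SSH down to a local machine:

rsync -avz -e ssh username@server.example.com:/home/username/ /local/backups/home/

The -a flag preserves permissions and timestamps, -z compresses data in transit, and because rsync only transfers files that have changed, each run after the first is much cheaper than a full copy.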


#4

A ‘dreamhost-specific’ tutorial on rsync could be a wonderful thing. I’ve looked at various rsync pages I could google up but they mostly scared me half to death.

I guess it would be too hard to make simple though. There are so many possible configurations - one domain, multiple domains, etc.

Hm, maybe a suggestion for a panel-based ‘rsync’ configurator for one-button (or cron) backup :slight_smile:


#5

Chris C, what is the cron job line I can use to make backups of the MySQL databases? Thanks!


#6

Here is a copy of the shell script that I use to dump my MySQL databases via a cron job:

---- dump_databases.csh -------
#!/bin/bash -f
# Work from the backup directory and tag each dump with today's date
cd /home/ccannon/mysql_backup
stamp=$(date +%d%m%y)
mkdir files/$stamp
# Supply your own MySQL username and password after -u and -p
mysqldump -u -p -h vzforumv2.virtualzoo.net vzforumv2 | gzip > files/$stamp/vzforumv2_mysql_$stamp.sql.gz
mysqldump -u -p -h vzwar.virtual-zoo.net vzwar | gzip > files/$stamp/vzwar_mysql_$stamp.sql.gz
mysqldump -u -p -h photos.nimanoo.net photos_nimanoo | gzip > files/$stamp/photos_nimanoo_mysql_$stamp.sql.gz
mysqldump -u -p -h llukforum.liveleagueuk.com llukforum | gzip > files/$stamp/llukforum_forums_mysql_$stamp.sql.gz
# Remove dump directories older than 5 days
\rm -rf $(find . -ctime +5 -name \* -type d)

This simple shell script dumps the MySQL databases with unique names based on the date, and removes any dump directories older than 5 days. That prevents the directory from filling up if you forget about the cron job.
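
To answer the earlier question about the cron line itself: assuming you saved the script as /home/username/dump_databases.csh (adjust the path to your own account), a crontab entry like this would run it every day at 2 PM:

0 14 * * * /home/username/dump_databases.csh

The five fields are minute, hour, day of month, month, and day of week; you add the entry with crontab -e.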


#7

How do I run that script? I changed what needed to be changed for it to work on my site, but the following error occurs:

[yoda]$ /home/omlettex/db.csh
bash: /home/omlettex/db.csh: /bin/bash: bad interpreter: Permission denied



#8

You can either use the source command

source db.csh

Or you need to change the permissions on the script to make it executable

chmod u+x db.csh

This will change the permissions such that the user (you) has execute permission on the file.
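
Once it is executable, you can run the script directly by giving its path (assuming it sits in your home directory, run it from there):

./db.csh

That same path is also what you would put in your crontab to run it automatically.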


#9

[yoda]$ /home/omlettex/db.csh
: unrecognized option
Usage: /bin/bash [GNU long option] [option] ...
/bin/bash [GNU long option] [option] script-file ...
GNU long options:
--debug
--dump-po-strings
--dump-strings
--help
--init-file
--login
--noediting
--noprofile
--norc
--posix
--rcfile
--restricted
--verbose
--version
--wordexp
Shell options:
-irsD or -c command or -O shopt_option (invocation only)
-abefhkmnptuvxBCHP or -o option

Then if I try source db.csh


[yoda]$ source db.csh
: No such file or directorysqlbackup
mkdir: cannot create directory `files/230105\r\r': No such file or directory
: No such file or directory

Inside the db.csh file, however, the line reads:
cd /home/omlettex/mysqlbackup

And this folder exists.

EDIT: And now when I use source db.csh, it ends up trying to remove a bunch of files; some are deleted, but most give access denied errors.