Running Bash Script from Cron


#1

Hi there, trying to run a variation on the script linked here: http://www.blog.magepsycho.com/backup-wordpress-project-files-db-using-bash-script/ via cron. It works flawlessly if I ssh in, cd into the directory, and run sh myscript.sh manually. But when I try to have cron run it, I run into all sorts of problems. I am not entirely clear on what path structure or structures I should be adding, or whether calling a script like this from cron is even doable without significant modifications. New to cron and bash, could use some help.


#2

You’re on the right track, and fixing it is likely fairly simple. Many forget the step of making sure the script works from the command prompt first; if it does, you’re more than halfway to the finish line.

What errors are you getting? What command line are you using with cron? Are you using the cron interface in the DreamHost panel, or manually editing your crontab?


#3

Yes, the script works perfectly from the command prompt.

ssh username@mydomain.com
cd mysubdomain.iwanttobackup.com
sh mybashbackupscript.sh

Boom, everything works.

From the DreamHost panel I am setting the cron command to… /home/username/mysubdomain.iwanttobackup.com/mybackupscript.sh

I will have to check on the error I am getting, but from what I can tell the script is trying and failing to back up everything under mydomain.com instead of the specific subdomain I am trying to back up, and then putting everything in the root directory instead of the backups folder I have specified.

/home
  - mydomain.com
  - mybackupfolder
  - mysubdomain.iwanttobackup.com

How can I provide more information to help troubleshoot this? It works exactly how I want it when called manually from command prompt.


#4

Is cron sending you an email? If so, show us the body of it; it should tell us what is working and what is not. You can obscure sensitive info if there is any.


#5

When you ssh in, run the command “crontab -l” and see if the entry for the script works from the command line. If it does, then you probably need to specify environment variables that the script requires. Cron does not run with the same environment variables as the web server or an interactive shell.
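To illustrate: cron typically starts jobs with a nearly empty environment, so commands that resolve fine in an interactive shell can be “not found” under cron. One common fix (a sketch; the exact directories vary by host) is to set PATH explicitly at the top of the script:

```shell
# Cron's environment is minimal (often just HOME, SHELL, LOGNAME), so set
# PATH explicitly near the top of the script. These directories are typical
# defaults, not necessarily what your host uses.
PATH=/usr/local/bin:/usr/bin:/bin
export PATH

# Sanity check: tools the backup script relies on should now resolve.
command -v tar
```

An alternative is to use full paths (e.g. /usr/bin/mysqldump) for every external command in the script.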


#6

You say that when you ssh in you have to cd to the directory that you want to back up. It seems likely that the script needs to be run from that directory – but cron won’t know that. Try adding

cd ~/mysubdomain.iwanttobackup.com

near the beginning of the script. (Not as the first line, which is probably #!/bin/bash or similar, but after that.)

You need to be able to run it from the command line without doing the cd first. (Because cron won’t do that for you.)
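Putting that advice together, the top of the script would look something like the sketch below. The real target would be ~/mysubdomain.iwanttobackup.com; $HOME stands in for it here just so the sketch runs anywhere:

```shell
#!/bin/bash
# First thing after the shebang: cd into the directory to back up, and bail
# out if the cd fails. ($HOME is a stand-in for the real site directory,
# e.g. ~/mysubdomain.iwanttobackup.com.)
cd "$HOME" || exit 1

# Everything below can now safely use paths relative to the site directory.
```

With the cd (and its failure check) inside the script, the same command works from an interactive shell and from cron.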


#7

Ok, I think cd ~/mysubdomain.iwanttobackup.com really helped push this along. I also had been trying to echo output like echo "$(tput bold)Date:$(tput sgr0) $TODAY", but I guess cron can’t spit out stuff the same as a shell? In the emailed output it says tput: No value for $TERM and no -T specified. Which leads me to another problem… the email notices seem to be not entirely predictable. I’ve gotten a few sporadic emails with the cron set at 10 min all morning, and I can’t seem to get any predictability with the email notices. Any ideas what is happening there?

When you mention running the crontab -l command… you mean

15 * * * * /usr/local/bin/setlock -n /tmp/cronlock.3721486.174425 sh -c $’/home/username/mysubdomain.iwanttobackup.com/_mycronshell.sh’???

I think I’m getting close here. On a non cron note are there any methods to encrypt a password in a bash script so it’s not just plaintext?


#8

“tput” stands for “terminal put”. It only works in scripts that are actually running in a terminal, not scripts that are running noninteractively, such as from a cron job.
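One common guard (a sketch, not the only approach): only call tput when stdout is actually a terminal, so the same script works interactively and under cron, where $TERM is unset and tput complains.

```shell
# Only use tput styling when stdout is a terminal; under cron, fall back to
# plain text so tput is never called without a $TERM.
if [ -t 1 ]; then
    bold=$(tput bold)
    normal=$(tput sgr0)
else
    bold=""
    normal=""
fi
echo "${bold}Date:${normal} $(date)"
```

Interactively you still get bold labels; from cron the variables are empty strings and the output is plain.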

As far as the timing goes, how long does the script take to run from start to finish? The “setlock” bit in the job makes it so that the script won’t run if there’s already another instance running. (Which is a good thing; trying to run the script again if it hasn’t finished from the first time yet is an invitation for trouble.) So if the job takes more than 10 minutes to back up your site, it won’t run as often as you expect.


#9

Ah, it’s less than 150MB, but let me set it to run once an hour and see what happens. That’s a good point about setlock.


#10

Ok… seems like this is working now. Perhaps running setlock every 10 min was conflicting with things somewhat… seems strange for under 150MB, but servers can be strange sometimes. I set it to 30 min and have consistently been getting emails and successful cron runs now. Originally this was a shell script with output intended for the terminal. It’s kind of irrelevant to get an email that says things like ‘Archiving… Done!’… wondering what info would be useful to receive in an email via cron? Or is it usually enough just to get the email letting you know the cron job completed successfully?
[hr]
Here is the modified bash script as of now that I pulled from the original link I posted. It seems to work well. A couple things… it would be cool to have the backup archive organized so that there is a www and a database directory that the respective files go into per archive. I’m a bit confused how to make that happen. Also, what permissions should the script have… 700? I’m concerned about the security of this file. I had another backup script I was using where I just entered the plaintext user and password, which I was uncomfortable with. This one pulls them from the config file… not sure if that’s any more or less secure. Thoughts? Thanks for all the help.

[php]
#!/bin/bash

cd ~/mysub.domain.com || exit 1

#/************************ EDIT VARIABLES ************************/
projectName="mysub_demo"
backupDir="/home/username/_backups"
#/************************ //EDIT VARIABLES **********************/

fileName=$projectName-$(date +"%Y-%m-%d-%H%M")
host=$(grep DB_HOST "wp-config.php" | cut -d "'" -f 4)
username=$(grep DB_USER "wp-config.php" | cut -d "'" -f 4)
password=$(grep DB_PASSWORD "wp-config.php" | cut -d "'" -f 4)
dbName=$(grep DB_NAME "wp-config.php" | cut -d "'" -f 4)

# Initial setup
TODAY=$(date)
echo "----------------------------------------------------
Date: $TODAY
Host: mysub.domain.com automated backup"

# Backup DB
echo "----------------------------------------------------"
echo "Dumping MySQL…"
mysqldump -h "$host" -u "$username" -p"$password" "$dbName" | gzip > "$fileName.sql.gz"
echo "Done!"

# Backup files
echo "----------------------------------------------------"
echo "Archiving Files…"
tar -zcf "$fileName.tar.gz" * .htaccess
echo "Done!"
echo "----------------------------------------------------"
echo "Cleaning…"
rm -f "$fileName.sql.gz"
echo "Done!"

# Move to backup directory
echo "----------------------------------------------------"
mkdir -p "$backupDir"
echo "Moving file to backup dir…"
mv "$fileName.tar.gz" "$backupDir"

# Keep last 30 days of backups
echo "----------------------------------------------------"
echo "Removing old backups…"
find "$backupDir" -type f -mtime +30 -exec rm {} +
echo "Backup Complete!"
[/php]
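On the question of getting a www and a database directory inside each archive: one possible approach (a sketch with made-up names, not the only way) is to stage the files in a temporary directory first and archive that. Here $fileName mirrors the variable in the script above, and the source directory is a demo stand-in for the real site directory:

```shell
#!/bin/bash
# Sketch: stage site files and the DB dump into www/ and database/ folders,
# then archive the staging directory so the tarball has that layout.
fileName=demo-backup
srcDir=$(mktemp -d)            # stand-in for ~/mysub.domain.com
echo "<html></html>" > "$srcDir/index.html"

stage=$(mktemp -d)
mkdir -p "$stage/www" "$stage/database"

# Site files go under www/, the (demo) DB dump under database/.
cp -R "$srcDir/." "$stage/www/"
echo "-- demo dump --" | gzip > "$stage/database/$fileName.sql.gz"

# Archive from inside the staging dir so paths in the tarball are clean.
tar -zcf "/tmp/$fileName.tar.gz" -C "$stage" www database
rm -rf "$stage" "$srcDir"
```

Extracting the resulting archive then yields www/ and database/ side by side, one pair per backup.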


#11

As far as cron emails go, they are useful for debugging, period. Nothing more, nothing less. The script you presented apparently was written (at least originally) to run interactively, thus wording like “Archiving… Done!” means more interactively than it does in a cron email.

I suppress or turn off all emails from cron. Not that my scripts don’t send email, but they only send me an email that I need to deal with, i.e. errors, failures and problems. To do that, cron doesn’t do the emailing; the script itself does.
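One way to get that “quiet on success, noisy on failure” behavior without extra tools: since cron emails whatever a job prints, a script that prints nothing on success produces no email. A runnable sketch with demo stand-in paths:

```shell
#!/bin/bash
# Sketch: capture each step's output; only print (and thus trigger a cron
# email) when something fails. Paths here are demo stand-ins.
src=$(mktemp -d)                       # stand-in for the site directory
echo "demo" > "$src/index.html"
out="$src/backup.tar.gz"
log=$(mktemp)

if ! tar -zcf "$out" -C "$src" index.html > "$log" 2>&1; then
    # Any output here becomes the body of the cron email.
    echo "Backup of $src failed:"
    cat "$log"
    exit 1
fi

rm -f "$log"
# Success path: print nothing, so cron stays silent.
```

The Cronic wrapper mentioned later in this thread automates essentially this pattern.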


#12

Yea, I think I am going to remove most of those echos. What would be an example of an error that would be useful to have sent via email? And what permissions should this file be set to?

I am reading up on Cronic for error handling: http://habilis.net/cronic/. Anyone use it? Looks like it needs to be installed in /usr/bin/…

Here is the updated script without all the output code.

[php]
#!/bin/bash

cd ~/mysub.example.com || exit 1

#/************************ EDIT VARIABLES ************************/
projectName="mysub_demo"
backupDir="/home/username/_backups"
#/************************ //EDIT VARIABLES **********************/

fileName=$projectName-$(date +"%Y-%m-%d-%H%M")
host=$(grep DB_HOST "wp-config.php" | cut -d "'" -f 4)
username=$(grep DB_USER "wp-config.php" | cut -d "'" -f 4)
password=$(grep DB_PASSWORD "wp-config.php" | cut -d "'" -f 4)
dbName=$(grep DB_NAME "wp-config.php" | cut -d "'" -f 4)

# Backup DB
mysqldump -h "$host" -u "$username" -p"$password" "$dbName" | gzip > "$fileName.sql.gz"

# Backup files
tar -zcf "$fileName.tar.gz" * .htaccess
rm -f "$fileName.sql.gz"

# Move to backup directory
mkdir -p "$backupDir"
mv "$fileName.tar.gz" "$backupDir"

# Keep last 30 days of backups
find "$backupDir" -type f -mtime +30 -exec rm {} +
[/php]

Do I need to be wrapping each step in some kind of if else to do error checking?

like

[php]
if [ "$?" = "0" ]; then
	#some command
else
	echo "Error. Couldn't do some command!" 1>&2
	exit 1
fi
[/php]
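A lighter-weight pattern than testing "$?" after every step is to attach the error handler directly to each command with ||, as the cd line in the script already does. A runnable sketch with demo paths (the real script would use its own directories and a mysqldump step):

```shell
#!/bin/bash
# Sketch: per-step error handling without explicit $? tests. Each command
# either succeeds or reports to stderr and aborts the script.
workDir=$(mktemp -d)    # demo stand-in for the site directory

cd "$workDir" || { echo "Error. Couldn't cd to $workDir!" 1>&2; exit 1; }

echo "data" > file.txt

tar -zcf archive.tar.gz file.txt \
    || { echo "Error. Couldn't archive files!" 1>&2; exit 1; }
```

Adding "set -e" near the top of a script is a blunter alternative: it aborts on any failing command, but without the custom messages.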


#13

Ok, I’ve tried updating this with a couple more checks and some handling. Could use some advice; I’m flying by the seat of my pants here.

#!/bin/bash

cd ~/mysub.example.com || exit

#/************************ EDIT VARIABLES ************************/
projectName="mysub_demo"
backupDir="/home/username/_backups"
#/************************ //EDIT VARIABLES **********************/

fileName=$projectName-$(date +"%Y-%m-%d-%H%M")
host=$(grep DB_HOST "wp-config.php" |cut -d "'" -f 4)
username=$(grep DB_USER "wp-config.php" | cut -d "'" -f 4)
password=$(grep DB_PASSWORD "wp-config.php" | cut -d "'" -f 4)
dbName=$(grep DB_NAME "wp-config.php" |cut -d "'" -f 4)

# Backup DB
mysqldump -h "$host" -u "$username" -p"$password" "$dbName" | gzip > "$fileName.sql.gz"

# Backup files
tar -zcf "$fileName.tar.gz" * .htaccess &&
	rm -f "$fileName.sql.gz"

# Move to backup directory
mkdir -p "$backupDir"
mv "$fileName.tar.gz" "$backupDir"

# Keep last 30 days of Backups
find "$backupDir" -type f -mtime +30 -exec rm {} +

Curious: would it be more secure to take the wp-config grep stuff out of this, hard-code the values in a separate file with permissions 600, and then do the following?

source /home/username/mydomain.com/userandpasswordwhatever.sh
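That sourced-credentials idea would look something like the sketch below. The filename and variable names are made up; a real file would live somewhere like ~/.wp_backup_creds with permissions 600 (here a temp file stands in so the sketch is runnable):

```shell
#!/bin/bash
# Demo stand-in for a real 600-permission credentials file, e.g.
# ~/.wp_backup_creds (filename is hypothetical).
credFile=$(mktemp)
chmod 600 "$credFile"
cat > "$credFile" <<'EOF'
host="localhost"
username="wp_user"
password="example_password"
dbName="wp_db"
EOF

# In the backup script, this one line replaces the four grep-on-wp-config
# lines; the sourced file sets host, username, password and dbName.
source "$credFile"
```

Security-wise this is roughly equivalent to grepping wp-config.php: either way the secret sits in a readable file, so what matters most is the file's permissions (600) and keeping it outside any web-served directory.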