Bash scripting help

I’m writing a script to back up the forums, but I don’t know enough bash to finish it… This is my script so far; it’s only got about 10 errors total, so please help. Thanks.



```bash
# Start of Config

pathtobackup = "/home/sqwaw/backups"; # path for backups
foldername = `date`; # name for backup folder date and time stamped
dbuname = "titan"; # mysql username
dbpwd = "MY-PASSWORD"; # mysql password
dbname = "titandb"; # mysql db name
dbhost = ""; # mysql hostname
ftpswitch = 1; # switch ftp on/off # 1=on 0=off
ftphost = ""; # ftp server hostname
ftpuname = "sqwaw"; # ftp username
ftppwd = "MYPASSWORD"; # ftp pass
ftppath = "/"; # ftp path

# End of Config

# Anything you change below here is done at your own risk!

echo Starting SQL backup!

# Move us to the backup folder, or exit.
if cd $pathtobackup # Defined correctly in configuration... hopefully.
then
    # Changed dir to backup folder without complication.
    echo The backup folder WAS CD successfully!
else
    # Couldn't mount backup folder, exit.
    echo The folder backups are to be stored in could NOT be successfully mounted!
    exit 0
fi

# Sorted folders in backup dir
echo Creating sorted folder in $pathtobackup
mkdir $foldername
echo Sorted folder created successfully!
chmod 777 $foldername # Make the dir publically accessible?

# Creating SQL Dump
echo Dumping the current SQL database
cd $foldername

# Dump file to $pathtobackup using provided $dbuname, $dbhost, $dbpwd, and $dbname
mysqldump --user=$dbuname \
    --host $dbhost \
    --password=$dbpwd --flush-logs \
    --lock-tables $dbname > $dbname.sql

if [ $? -eq 0 ]
then
    # echo message
    echo The dump was created successfully!
else
    # echo message
    echo An error has occured, dump not completed!
    exit 0
fi

# Gzipping the SQL dump
echo Gzipping of the SQL database commencing...
cd $foldername # Go to the sorted dump folder

tar -cf $dbname.tar $dbname.sql # Compress the SQL Dump -> .tar

if gzip -9 $DB.tar
then
    echo $dbname.tar.gz sucessfully created!
else
    echo Tarball was not created or Gzipped successfully!
fi

# FTP the backup to a remote FTP server
if [ $ftpswitch -eq 1 ]
then
    # Login to the FTP server, outputting the screen to ftplog
    ftp -n $ftphost <<ftplog
# specify a user
user $ftpuname
# provide pass
# navigate to folder store and put files
cd $ftppath
mkdir $foldername
cd $foldername
put $dbname.tar.gz.enc
put $dbname.tar.gz
# quit the ftp
quit
ftplog
    echo ftplog
    echo Backup worked... Hopefully!
    exit 1 # return result
fi
```

Note - this is just based on a quick once-over. I’m not saying that I’ve pointed out all the errors, but I’ll mention some stuff that sticks out at me. You might want to put your script in some [code] tags or whatever makes stuff show up formatted in a fixed font, so that any indenting and other special formatting shows up.

Unless this is a learning project, I’d also suggest seeing if you can find an existing script that you can adapt to your use.

First thing - if you’re not going to reference full paths or use variables to define commands’ full paths, you really should set $PATH explicitly at the top of the script; a line that sets PATH to the standard bin directories and exports it should probably suffice in this case.

[quote]echo The folder backups are to be stored in could NOT be successfully mounted![/quote]
Enclose strings you’re echoing in quotes, e.g., echo "foo"

[quote]exit 0[/quote]
No, no. You have your exits backwards. You want to exit 1 in this case, and 0 if everything’s Ok (this is default, so you don’t need to put one at the end).
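For instance, a minimal sketch of that convention (the function name and paths here are made up for the example):

```shell
#!/bin/sh
# Sketch of the exit convention: non-zero status on failure, zero on
# success.  "enter_backup_dir" is a hypothetical helper for this demo.
enter_backup_dir() {
    if cd "$1"; then
        echo "Changed to $1"
        return 0              # success: status 0 (also a script's default exit)
    else
        echo "Could not cd to $1" >&2
        return 1              # failure: non-zero (use exit 1 in a script)
    fi
}

enter_backup_dir /tmp
```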

[quote]pathtobackup = "/home/sqwaw/backups"; # path for backups[/quote]
I’d use “${HOME}/backups” - not a biggie either way, but this makes it more portable

[quote]foldername = `date`[/quote]
That will give you a directory name with spaces in it, which will cause all sorts of problems. Plus, if you run the script twice in a day, it’ll hose or conflict with the first one.

You probably want to do something like:
FOLDERNAME=`date "+%y%m%d"`
FOLDERNAME=`date "+%y%m%d"`$$
(the latter appending the PID of the parent process)
It’s traditional to put variables in all caps. It’s safest to use:
${FOO} instead of $FOO (less ambiguous in certain cases).
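Putting those together, a sketch (the `backup-` prefix and the exact format string are my own choices, not anything from your script):

```shell
#!/bin/sh
# Sketch: build a space-free, sortable folder name from the date plus
# the shell's PID, so two runs on the same day can't collide.
# The "backup-" prefix and the %Y%m%d-%H%M%S format are illustrative.
FOLDERNAME="backup-$(date "+%Y%m%d-%H%M%S").$$"
echo "${FOLDERNAME}"
```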

[quote]ftp -n $ftphost <<ftplog[/quote]
Backwards again.

You probably want something like:
ftp -n $ftphost >> ${HOME}/ftplog 2>&1

(the 2>&1 sends both stdout and stderr to the logfile). I’d use scp or something instead, or else maybe use expect to handle the upload. I don’t think what you’ve put will work. You probably need to put the ftp commands in a heredoc.
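A small sketch of that redirection (the logfile path is a throwaway for the demo):

```shell
#!/bin/sh
# Sketch: ">> file 2>&1" appends BOTH stdout and stderr to one logfile.
# /tmp/demo.log.$$ is just an illustrative path.
LOG="/tmp/demo.log.$$"
{
    echo "this went to stdout"
    echo "this went to stderr" >&2
} >> "$LOG" 2>&1
```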

From a quick google: if the remote machine is a UNIXy machine, at least, you might be better off pulling the file than trying to push it - wget or fetch should let you specify a password… this will probably work more gracefully than your solution (just make sure that an error is emailed to you if retrieval fails, and make sure your naming scheme makes sense, so that your script on the receiving side looks for the right file).

Another option would be to use mutt to email the backup to you as an attachment.

[quote]echo ftplog[/quote]
Do you want to echo “ftplog”, or do you want to show the contents of “ftplog”? I’d either just use “tee” in the first place (to log to a file and to your terminal), or do “cat ftplog”
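For example (with a stand-in logfile path):

```shell
#!/bin/sh
# Sketch: tee writes to a logfile AND to the terminal in one go;
# cat just prints an existing file.  /tmp/teelog.$$ is illustrative.
LOG="/tmp/teelog.$$"
echo "transfer complete" | tee "$LOG"    # shows up on screen and in $LOG
cat "$LOG"                               # prints the file's contents again
```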

You can reduce this whole long section to something like:

tar cvzf $dbname.tar.gz $dbname.sql || { echo "Creating tarball failed"; exit 1; }

No dash before the tar options.

also, there’s no sense in making a tarball at all unless you’re putting multiple files in - otherwise just gzip the .sql file. What I’d do is put all your .sql files in $foldername, and then:

tar cvzf ${foldername}.tar.gz ${foldername}
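As a rough end-to-end sketch of that (everything under /tmp, including the dummy .sql file, is a stand-in for your real paths and for mysqldump output):

```shell
#!/bin/sh
# Sketch: collect .sql files in a per-run folder, then tar+gzip the
# whole folder in one step instead of tar-then-gzip separately.
cd /tmp || exit 1
foldername="sqlbackup-demo.$$"
mkdir "$foldername"
echo "-- dummy dump" > "$foldername/titandb.sql"   # stand-in for mysqldump output
tar czf "${foldername}.tar.gz" "$foldername" || { echo "Creating tarball failed" >&2; exit 1; }
echo "${foldername}.tar.gz created"
```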

Similarly, you could reduce another big if / then chunk to:
cd $pathtobackup || { echo "couldn't chdir to $pathtobackup"; exit 1; }

[quote]chmod 777 $foldername # Make the dir publically accessible?[/quote]
Uhh not unless you want everyone on your server to be able to read these files…

I’d put a “umask 077” at the top (this will have the opposite effect and set permissions to user-only for files and directories you create, without you having to use chmod at all).
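A quick sketch of the effect (the /tmp path is just for the demo):

```shell
#!/bin/sh
# Sketch: with umask 077, new directories come out rwx------ and new
# files rw------- (owner-only), with no chmod needed afterwards.
umask 077
dir="/tmp/umask-demo.$$"
mkdir "$dir"
touch "$dir/dump.sql"
ls -ld "$dir" | cut -c1-10          # drwx------
ls -l "$dir/dump.sql" | cut -c1-10  # -rw-------
```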

Wow, thanks. I’ll make all the changes you mentioned and report back on the errors I still have left, if any. :slight_smile: