Using CRON to save logs?


Can anybody tell me how I can get CRON to (ideally) email me my Dreamhost logs before they get deleted? Failing that, to get it to save them somewhere.

I recently had my site hacked and I reckon it happened weeks ago, and of course the logs have expired and been deleted.

Hope someone can help?



Emailing them on a daily basis is a bit overkill, but you can do this in the Panel under Goodies -> Cron Jobs:
Select your user
Give it a title
Email out to your email address
Command will be: /bin/cat ~/logs/EXAMPLE.COM/http/access.log
When to run: Custom
Selected minutes: 45
Selected hours: 23
Every Day, Every Month, Every Day of Week

The emailing out of the job will include its output, so you don’t need to redirect the cat command.
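For reference, the panel settings above correspond roughly to a crontab entry like the following (the MAILTO address and EXAMPLE.COM are placeholders; DreamHost's panel manages the actual crontab for you, so this is just the shape of what it generates):

```shell
# Cron mails each job's stdout to MAILTO, so the cat output arrives by email.
MAILTO=you@example.com
# minute hour day-of-month month day-of-week  command
45 23 * * * /bin/cat ~/logs/EXAMPLE.COM/http/access.log
```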

If you want to just keep a giant logfile, make the command:
/bin/cat ~/logs/EXAMPLE.COM/http/access.log >> ~/biglog.txt

This will just grow the biglog by appending each day’s log onto the end. You’ll need to manually delete or rename it once it gets too big to comfortably browse through.
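If you’d rather not do that by hand, a second cron job can rotate the big log for you; here’s a minimal sketch (the biglog.txt path matches the command above, and the monthly date stamp is my own choice):

```shell
# Simulate a day's worth of appended log lines (stand-in for the real biglog.txt).
printf 'example log line\n' >> "$HOME/biglog.txt"

# Rotate: stamp the combined log with the current month and compress it.
# The next day's "cat >>" will simply start a fresh biglog.txt.
stamp=$(date +%Y-%m)
mv "$HOME/biglog.txt" "$HOME/biglog.$stamp.txt"
gzip -f "$HOME/biglog.$stamp.txt"
```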


To archive my logs, I use a daily rsync cron job:

rsync --include='*/' --include='*.gz' --exclude='*' -avl logs/ logs_archive/

That command recursively copies all the gzipped files from logs to logs_archive. The nice thing about this setup is that the files stay as daily compressed files – small and easy to incrementally backup offsite.

To protect against cracking (or LA sliding into the sea), I remotely mirror (via rsync over ssh) my DH user’s home directories to my laptop. After that my regular backup software (TimeMachine) takes over and incrementally backs up to an external hard drive.

Even if a cracker deleted all my DH files a month ago, I will still have a copy of them in an incremental snapshot on my backup drive. The important thing for recovering from vandalism is to have good offsite/offline snapshots. There’s a cautionary tale about exactly this on Slashdot.



Thanks for the idea. Yours didn’t work for me as written, but I tweaked it so it worked.

Mine saves to this location: /home/MYUSERNAME/archive/
And I created a folder for the error logs and a folder for the access logs (to keep them nicely organized), and I only copy over the *.gz files.

Here is my code (2 separate cron jobs):

rsync --exclude='*/' --include='access.log*.gz' --exclude='*' -avl /home/_domain_logs/MYUSERNAME/ /home/MYUSERNAME/archive/access

rsync --exclude='*/' --include='error.log*.gz' --exclude='*' -avl /home/_domain_logs/MYUSERNAME/ /home/MYUSERNAME/archive/error

NOTE: the cron job was not able to create the directories…so manually create the archive, access, and error directories
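A one-time setup command takes care of that; `mkdir -p` creates any missing parent directories and does nothing if they already exist (MYUSERNAME is a placeholder, as above):

```shell
# One-time setup: create the archive tree so the cron jobs have somewhere to write.
# -p creates missing parents and is a no-op for directories that already exist.
mkdir -p "$HOME/archive/access" "$HOME/archive/error"
```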

I added a cron to backup the database as well:

# with compression
mysqldump --opt --user=DBUSER --password=DBPASS DBNAME | gzip > /home/USERNAME/archive/database/db-backup-$(date +%Y-%m-%d).sql.gz

# without compression
mysqldump --opt --user=DBUSER --password=DBPASS DBNAME > /home/USERNAME/archive/database/db-backup-$(date +%Y-%m-%d).sql



I’m on Dreamhost VPS. Will this command still work 8+ years later to archive the files to another directory on the server? I want to take the logs files for all of my domains and archive them.

Is Rsync still the best way to move the log files from the server to my computer or Dropbox, Google Drive on a routine basis?

I haven’t touched a command line since the early 2000s, and even then I was doing pretty basic stuff like grep’ing log files to troubleshoot problems on my then-employer’s websites.

The command I use now is basically the same – slightly adjusted to only archive http/https logs (avoiding other cruft in ~/logs/):

rsync -avL \
--include='/*/' --include='http/' --include='https/' --include='*.gz' \
--exclude='*' \
~/logs/ ~/logs_archive/


Is Rsync still the best way to move the log files from the server to my computer or Dropbox, Google Drive on a routine basis?

Certainly rsync is a good way to back up to your own computer, especially if it is a Unix machine. I don’t know the specifics of backing up to Dropbox/GDrive/etc.


Thank you so much. I appreciate it!