Ack...No More Autobackups to my PC?


#1

So after a scary encounter in which all my files (including backups) disappeared, I’ve been trying to set up an automatic backup to my home PC following these instructions.

http://wiki.dreamhost.com/index.php/Automatic_Backup

Problem is that I don’t have the required nightly.0/nightly.1/weekly.0/etc. folders. The folders I have in .snapshot are as follows (and this is just for today).

20070117.003049
20070117.063701
20070117.124605

How do I automate anything with a seemingly random folder name (at least the part after the .)? I dropped an email to support and was told that what I was experiencing was a feature of a new backup server and that there was nothing they could do about it. They’ve been very friendly about trying to find workarounds, but so far we’ve come up blank.


#2

That naming scheme does create a problem for using the “canned” script in the wiki to automate the backups.

That said, it doesn’t look to me like they are random - they look like date/time references. Maybe as in:

20070117.003049 = YYYYMMDD.HHMMSS ?
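
In date(1) terms, that would be the output of something like:

[quote]
date +%Y%m%d.%H%M%S
[/quote]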

You might be able to confirm that’s the case by using ls -la to compare the times the folders were created against their filenames (I can’t check myself, as I can’t see the files - my snapshot directory still uses the “old” naming convention).

It looks like you’ll need to tweak/reprogram your automated back-up scripts to account for the new naming scheme.
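
For instance, since those timestamped names sort chronologically, the script might locate the newest snapshot with something like this (just a sketch, untested):

[quote]
#!/bin/bash
# YYYYMMDD.HHMMSS names sort chronologically, so the last one is the newest
latest=$(ls /home/username/.snapshot/ | sort | tail -n 1)
cd /home/username/.snapshot/$latest/
[/quote]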

–rlparker


#3

You are indeed correct…YYYYMMDD.HHMMSS (just got word back from support).

I’m fairly novice at scripting, so any knowledge you could throw my way would be appreciated. Day to day, the times have not been consistent: nothing ran on the 13th/14th/15th, and then, starting yesterday at 6:00 pm, they have run every six hours, within a range of 10 minutes or so.

When I do an ls -la, all four backups for today (another just popped up) show as being created yesterday at 18:54.

Seriously…WTH is going on?


#4

[quote]When I do an ls -la, all four backups for today (another just popped up) show as being created yesterday at 18:54…
Seriously…WTH is going on?[/quote]
That tells me that my thought of confirming the naming scheme that way isn’t gonna work (though that’s not really needed now, as DH support confirmed it for you :wink: ).

I suspect that “WTH is going on” is that they are “building/maintaining” the backups “off server” (maybe on NFS?) and then “batching” them onto your server space so you can get at them - but that is just a guess.

I’m not much of a shell script maven myself, but Matttail (who I think originally wrote the automated backup stuff in the wiki) is still around these forums, and there are others here that might take a crack at that “re-programming” task of accounting for the “new” filenaming convention. :slight_smile:

–rlparker


#5

There’s no need to use the .snapshot directory - you can just grab the files straight from your home directory. Using the snapshot just seemed logical to me at the time for some reason, though I have confirmed with support people that there’s no difference in server load or anything.

So I think if you change the script from this:

[quote]
#!/bin/bash
suffix=$(date +%y%m%d)
cd /home/username/.snapshot/nightly.0/
tar -cf /home/username/backups/archives/domain.$suffix.tar domain.com/
[/quote]
to something like this (mind you, this is untested and may not work at all):
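
[quote]
#!/bin/bash
suffix=$(date +%y%m%d)
# grab the live site straight from the home directory instead of .snapshot
cd /home/username/
tar -cf /home/username/backups/archives/domain.$suffix.tar domain.com/
[/quote]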

Let me know if that works for you, or if I need to be clearer. Perhaps the wiki article should be changed if this is going to be the norm for new customers (and, I would assume, eventually existing customers).

–Matttail
art.googlies.net - personal website


#6

That’s great, Matt, and it is also really good to see you back in action on the forums again…you have been missed :slight_smile: !

–rlparker


#7

[quote]
I’m not much of a shell script maven myself, but Matttail (who I think originally wrote the automated backup stuff in the wiki) is still around these forums, and there are others here that might take a crack at that “re-programming” task of accounting for the “new” filenaming convention.
[/quote]
Yep, I’m right here - coming in with a post almost exactly at the same time. :slight_smile:

–Matttail
art.googlies.net - personal website


#8

I’ll give that a shot…Thanks a bunch. Is there a simple way to exclude the .snapshot folder? I’ve got 30G I’m backing up already. Including the eleventy billion copies of my site found in .snapshot would be a killer.

Also - while conversing with Support they recommended adding ‘nice -19’ to the beginning of the script so that the process watchers wouldn’t kill it (which was happening to me).
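
Presumably that just means sticking it in front of the tar line, something like:

[quote]
nice -19 tar -cf /home/username/backups/archives/domain.$suffix.tar domain.com/
[/quote]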


#9

In my updated script above, only the domain.com folder that holds your site content gets backed up - so it wouldn’t grab anything from the .snapshot directory.

Yes, it would certainly be a good idea to put nice -19 in front of the tar command, especially if you’ve got that much data.

When I was writing the script, the largest site I was personally dealing with was only 100MB, so I had no such worries.

–Matttail
art.googlies.net - personal website


#10

There may be other directories to back up, like Maildir or logs, and scripts or other files kept elsewhere, like /home/user/bin, etc.


#11

That’s where I was heading, but my complete lack of tar knowledge got in the way.

Is there a way to tar an entire directory except one or two folders (e.g. .snapshot)? I googled my way to an -except switch, but was never able to get it to work…Currently I’m just specifying directory by directory, but a catchall would be much simpler.


#12

As it is on the server I’m using, the “tar” command does not have an “-except” switch implemented.
However, it does have the “--delete” option, so what you’d probably hafta do is first “tar” the whole directory and then run a second “tar” to excise the bits ya don’t wanna keep. I’d run a few tests before going “live” with that sorta thing, though.
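
Something along these lines might do it (completely untested, and assuming GNU tar - the archive name is just an example):

[quote]
#!/bin/bash
suffix=$(date +%y%m%d)
cd /home/username/
# first pass: tar up the whole home directory
# (GNU tar notices the archive file is inside the tree and skips it)
nice -19 tar -cf /home/username/backups/archives/home.$suffix.tar .
# second pass: excise the .snapshot copies from the archive
# (--delete only works on plain, uncompressed tar files)
nice -19 tar --delete -f /home/username/backups/archives/home.$suffix.tar ./.snapshot
[/quote]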


#13

I would recommend listing several directories to be backed up instead of trying to exclude one or delete it afterwards. According to the man page for tar, you can simply list multiple directories, so your tar line would now look something like:
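
[quote]
# list each directory you want included (names here are just examples)
nice -19 tar -cf /home/username/backups/archives/home.$suffix.tar domain.com/ Maildir/ logs/
[/quote]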

I haven’t tested the command, so I’m not positive that’s right, but it should be. It definitely seems easier to do that than to go back and delete unnecessary files, especially if it’s very much data.

On a side note, I’m not positive, but I think that if you backed up your entire home directory, .snapshot would not be included, because it’s a specially hidden directory and doesn’t show up on any directory lists. Generally, the way programs behave is to get a directory list and then specifically access those directories and files. So if the directory can’t be seen, it stands to reason that it also wouldn’t be backed up. Note that this would not be true for standard hidden files and directories - just the .snapshot directory, because of the way DH has configured it.
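
One way to check (assuming the hiding works the way I’ve described):

[quote]
ls -a /home/username/ | grep snapshot   # prints nothing: .snapshot is hidden from listings
ls /home/username/.snapshot/            # but accessing it directly by name still works
[/quote]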

–Matttail
art.googlies.net - personal website


#14

I’ve been able to successfully tar my directory with the following:
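
[quote]
# same multi-directory approach as suggested above
nice -19 tar -cf /home/username/backups/archives/home.$suffix.tar domain.com/ Maildir/ logs/
[/quote]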

Didn’t need the trailing ‘/’, though it probably makes for easier reading.

Good point on .snapshot being hidden - I’ll give it a shot. If tar does grab .snapshot, I’ll just go back to naming the directories individually, because at this rate I’ll have 120G worth of backups per day in the .snapshot folder. Just two days’ worth would put the resulting file over my allotment.


#15

Well, the leading “/” tells it to start at the root path, giving you an absolute path instead of a relative path. That’s not a problem if you’re sure the location is correct and the script is being run from the right directory to begin with, but it is common practice in Linux/Unix environments to perform operations like these on absolute paths, just so there is no confusion about the location being named.


#16

Yeah, except we don’t have a leading /, but a trailing one - and I don’t see an absolute path for the folders to be tar’d, just their names.

I had thought the trailing / would, in some instances, instruct *nix to grab all files and sub-directories…though in this instance, without the trailing /, I’m still grabbing the subdirectories, so what do I know?

Very little I tell you, very little. :slight_smile: