[SGVLUG] Sending backup file to remote/offsite storage

Emerson, Tom Tom.Emerson at wbconsultant.com
Tue Aug 8 10:13:37 PDT 2006


I've set up a Linux system for a friend, and even thought far enough
ahead to set up a couple of cron-automated backup jobs so that if he
hoses something, I'll at least have something I can recover from (though
I'm finding it's painfully slow...)

He recently had some (minor?) corruption on his hard drive, and it made
me realize that the backups are all on the same physical device -- that's
OK for cases where he munges a config file or some such, but it doesn't
do diddly if he loses the drive itself, so I'm formulating "plan B".

It turns out that SuSE's administrative user interface (YaST) has a
module for creating fairly rudimentary backups and automating them,
which is what I've done (one job for the "user" backup of /home, and
another for the "system" backup of things like /etc, the actual
packages that are installed, and so on).  You have the option of a
"plain" tar file, a gzipped tar file, a gzipped tar file of tar
sub-files, and so on.  About the only other things you control are the
location of the resulting file and "how many generations" to keep on
disk.

I'm not sure, but I think the program either first renames any prior
instance of the named backup file (based on its creation date?) and
then creates the new backup, or it renames the backup on completion.
Either way, what I typically see in the directory are files named with
the date & time (a 14-digit number) followed by the name I gave the
job, so for instance you might see this:

   20060807030456-user.tgz
   20060807235214-system.tgz
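
That 14-digit prefix appears to be the output of date +%Y%m%d%H%M%S,
which has a handy side effect: sorting the file names as plain strings
also sorts them chronologically, so picking off the newest backup takes
no date math at all:

   $ date +%Y%m%d%H%M%S
   20060807030456
   $ ls *-user.tgz | tail -n 1     # last name in sort order = newest
   20060807030456-user.tgz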

What I'd like to do is create a script to run [some time...] after the
backup that copies the file to my server (via scp, most likely) at a
time when I'm not likely to be using the system (4:45 am, for
instance...).  Any suggestions on how to go about it?
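
Here's the rough shape of what I'm picturing -- the host, paths, and
script name below are placeholders, and it assumes passwordless ssh
keys are already set up between the two boxes:

   #!/bin/sh
   # copy the newest "user" and "system" backups to the remote server
   BACKUP_DIR=/var/lib/backup                 # wherever YaST drops them
   REMOTE=backup@my.server.example:/srv/backups

   for name in user system; do
       # the timestamp prefix sorts chronologically, so the last match
       # in plain sort order is the newest backup of that type
       newest=$(ls "$BACKUP_DIR"/*-"$name".tgz 2>/dev/null | tail -n 1)
       [ -n "$newest" ] && scp -p "$newest" "$REMOTE"
   done

plus a crontab entry (crontab -e as root) to fire it at 4:45 am:

   45 4 * * * /usr/local/sbin/copy-backups-offsite.sh

rsync over ssh would be the other obvious candidate, since it can skip
files the server already has, but scp keeps things simple.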

