[SGVLUG] Sending backup file to remote/offsite storage

Emerson, Tom Tom.Emerson at wbconsultant.com
Thu Aug 10 11:46:24 PDT 2006


> -----Original Message----- From: David Lawyer
> On Tue, Aug 08, 2006 at 10:43:41AM -0700, Matthew Gallizzi wrote:
> > [in response to my question]
> > 
> > Hmm, what I'd probably do is one of the following
> > 1) If he wants version history, write a bash script to [...]

As it turns out, the Yast backup module does almost exactly that.
Basically, you set the filename (/srv/backups/user.tgz) and the script
first renames any /existing/ "user.tgz" file in the directory to
"<timestamp>-user.tgz" (based on the file's creation/access date, I
imagine) before creating any new archive.  The Yast module also takes
care of the "number of versions to retain..."

> One problem with a big tar archive (file) is that if it gets 
> corrupted, everything could be lost.

This is only the "suspenders" part of "belt and suspenders" -- the backups
are stored on the source computer, with a copy of the most current one going
to an offsite computer.
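
(For what it's worth, the "copy offsite" step is nothing fancy -- a single
push of the current archive over ssh, something like the line below, with
the hostname and remote path made up for the example:)

    # push only the newest archive to the offsite box
    scp /srv/backups/user.tgz backup@offsite.example.com:/srv/backups/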

As for overwriting the previous backup: again, not a problem in this case.
On the source computer I keep a history of prior backups (3, actually),
which gives me a couple of chances at recovery.  The file sent to the
"offsite" computer is ONLY the most current backup -- this is for the
(hopefully unneeded) case of replacing the hard drive (in which case,
"prior versions" are a bit of a moot point...)  The system also only
performs a backup once a week, as this is an "end user" workstation he is
using for e-mail and casual web-surfing.  You might argue that this
should be backed up daily (being e-mail and all that...) but then again,
it's only /e-mail/ -- passed-along jokes and tidbits of conversation,
not anything vital or life-and-death.
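
(As an aside: Yast handles the scheduling itself, but if you were doing this
by hand, the weekly cadence would just be one crontab line -- the script
path here is hypothetical:)

    # run the backup early every Sunday morning
    30 3 * * 0  /usr/local/sbin/weekly-backup.sh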


