[SGVLUG] Sending backup file to remote/offsite storage

Matthew Gallizzi matthew.gallizzi at gmail.com
Tue Aug 8 10:43:41 PDT 2006


Tom,

Hmm, what I'd probably do is one of the following:
1) If he wants version history, write a bash script that tars up the
directories you feel are crucial (/home, /etc, ...), run it from a cron job,
save the resulting .tgz files to a staging directory such as /home/backups,
then have rsync copy that directory over to your server.
2) If he just wants backups and version history isn't important, then rsync
his /etc, /home, and whatever else you want to a location on your server.
Set rsync to copy over only the files that are new or have been updated.

This is the way I would do it. Good luck.

On 8/8/06, Emerson, Tom <Tom.Emerson at wbconsultant.com> wrote:
>
> I've set up a linux system for a friend, and even thought far enough
> ahead to set up a couple of cron-automated backup jobs so that if he
> hoses something, I'll at least have something I can recover from (though
> I'm finding it's painfully slow...)
>
> He recently had some (minor?) corruption on his hard drive, and it made
> me realize that the backups are all on the same physical device -- this
> is OK for cases where he munges a config file or some such, but it
> doesn't do diddly if he loses the drive itself, so I'm formulating "plan
> B"
>
> It turns out that SuSE's administrative user interface (Yast) has a
> module for creating fairly rudimentary backups and automating them,
> which is what I've done (one for the "user" backup of /home, and another
> "system" backup of things like /etc, the actual packages that are
> installed, and so on).  You have the option of a "plain" tar file,
> gzipped tar file, gzipped tar file of tar sub-files, and so on.  About
> the only other thing you control is the location of the resulting file
> and "how many generations" to keep on disk.
>
> I'm not sure, but I think that the way this works is that the program
> first renames any prior instance of the named backup file (based on
> cdate?), then creates the new backup -- OR -- it renames the backup at
> the completion -- either way, what I typically "see" in the directory
> are files named with the date & time (a 14-digit number) followed by the
> name I gave it, so for instance you might see this in the directory:
>
>    20060807030456-user.tgz
>    20060807235214-system.tgz
>
> What I'd like to do is create a script to run [some time...] after the
> backup to copy the file to my server (via scp, most likely) at a time
> when I'm not likely to be using the system (4:45 am, for instance...)
> any suggestions on how to go about it?
>
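For the scp copy Tom asks about above, a small script plus a cron entry would
do it.  This is a sketch only, assuming the Yast naming scheme shown above;
the directory names, destination, and script name are hypothetical:

```shell
#!/bin/sh
# Sketch: push the newest "user" and "system" backup files to another
# machine with scp.  The backup directory and destination are assumptions.

push_newest() {
    backup_dir=$1
    dest=$2                        # e.g. user@myserver:/srv/offsite/
    for kind in user system; do
        # the 14-digit timestamp prefix makes a plain sort chronological
        newest=$(ls "$backup_dir"/*-"$kind".tgz 2>/dev/null | sort | tail -n 1)
        [ -n "$newest" ] && scp -q "$newest" "$dest"
    done
    return 0
}

# Hypothetical crontab entry to run the script at 4:45 am every day:
#   45 4 * * * /usr/local/bin/push-backups.sh
```

Note that scp will prompt for a password unless you set up SSH public-key
authentication for the account doing the copy, which is what you'd want for
an unattended cron job anyway.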



-- 
Matthew Gallizzi
