Mon, 28 Mar 2005 5:28 PM

email about setting up a cron file

Michael Dale

I received an email earlier today from a reader who has a better way to manage and run backups. Here is a slightly edited version of the email.


I tackled the same problem at my own company and I think my solution's a little nicer than yours, so I thought I'd share it with you :)


OK, so this is how I do backups for the local server at my company:


General Steps:


  1. Take a snapshot of everything I want to back up (every night)

  2. Archive and compress that snapshot (once a week)

  3. Upload that snapshot to a remote location (once a week)

  4. Leave the compressed archive lying around in a Samba share so I can burn it to DVD from my desktop machine whenever I get around to it.

The process is very similar to yours; the main difference between our implementations is that I use rsync to speed everything up and my scripts log their actions. I also wrote a couple of test scripts to automate testing and prove everything works. The whole thing is a little hacky and I'd like to make the format of the log files look nicer, but it does work :) It's also a really, really fast way to do backups. The technical details are (rough sketches of the scripts follow the list):


  1. Use rsync to copy your data from its current folder to the snapshot folder. This is really fast as rsync only copies the differences, not the entire tree. Logs on my server show it usually takes about 3 mins. I call this script 'snapshot.sh'.

  2. Use tar to create an archive.

  3. Gzip the archive with a version of gzip that has the '--rsyncable' patch applied[1]. I call the script that does steps 2 and 3 'archive.sh'.

  4. Upload the compressed archive to the remote server using rsync. I call this script 'upload_archive.sh' (I'm really creative)
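
To make the steps concrete, here is a rough sketch of what scripts along these lines could look like. The paths, options and server name are placeholders chosen purely for illustration; they aren't taken from the actual scripts described in the email.

    #!/bin/sh
    # snapshot.sh -- mirror the live data into the snapshot folder
    SRC=/home/data/
    SNAP=/backup/snapshot/
    LOG=/backup/logs/snapshot.log

    echo "snapshot started: `date`" >> $LOG
    # -a keeps permissions and timestamps; --delete removes files that no
    # longer exist in the source, so the snapshot stays an exact mirror
    rsync -a --delete $SRC $SNAP >> $LOG 2>&1
    echo "snapshot finished: `date`" >> $LOG

    #!/bin/sh
    # archive.sh -- tar the snapshot and compress it with the patched gzip
    SNAP=/backup/snapshot
    ARCHIVE=/backup/archive/backup.tar.gz

    tar -cf - -C $SNAP . | gzip --rsyncable > $ARCHIVE

    #!/bin/sh
    # upload_archive.sh -- push the compressed archive to the remote server
    ARCHIVE=/backup/archive/backup.tar.gz
    rsync -e ssh $ARCHIVE backup@remote.example.com:/backups/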

I wrap these scripts in two scripts called 'everyday.sh' and 'weekly.sh', depending on what needs to happen on each of those days. There's also a little timer script that logs how long each command takes.
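
Since the post is about setting up a cron file, the wrappers might be wired into cron roughly like this; the run times and paths below are only an example, not the schedule from the email.

    # edit the backup user's crontab with 'crontab -e' and add something like:
    # nightly snapshot at 1:00 am
    0 1 * * *  /backup/scripts/everyday.sh
    # archive and upload early Sunday morning
    0 3 * * 0  /backup/scripts/weekly.sh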


For connecting to the remote server I use ssh with keys so I don't need to worry about storing passwords in any of the scripts.
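
For anyone who hasn't set that up before, key-based ssh for the upload step looks roughly like this; the key name, user and host here are placeholders:

    # on the backup server: create a key pair with an empty passphrase
    ssh-keygen -t rsa -N "" -f ~/.ssh/backup_key
    # append the public key to the remote account's authorized_keys
    cat ~/.ssh/backup_key.pub | ssh backup@remote.example.com 'cat >> ~/.ssh/authorized_keys'
    # tell rsync to use that key when uploading
    rsync -e "ssh -i ~/.ssh/backup_key" /backup/archive/backup.tar.gz backup@remote.example.com:/backups/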


The really cool part of this whole thing is in step 4, where rsync will only copy the parts of the compressed archive that have changed. I don't know exactly how this brilliant piece of magic works[2], but it's saved me and my clients quite a bit of time and money.


[1] The patch is already in Red Hat's gzip, so hopefully it's in the default FreeBSD one. Nothing in the man page about it yet though.

[2] I first read about it here, where the guy provides some explanation of what it's doing: http://lists.ubuntu.com/archives/ubuntu-devel/2005-January/003327.html
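
A quick way to check whether the installed gzip accepts the flag, given that it isn't mentioned in the man page, is simply to try it:

    # prints the message only if the flag is accepted; an unpatched gzip
    # will exit with an 'invalid option' error instead
    echo test | gzip --rsyncable > /dev/null && echo "--rsyncable is supported"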


 


--


Myles Byrne
Web Architect