Sunday, December 20, 2009

Thoughts drift to backup technology

Something that has been on my todo list for a while is setting up cron jobs to run dump on my file server, as an "extra" safeguard on top of the fact that my data is already mirrored across three different computers, hehe. (I also do periodic backups to a separate cold storage partition, and priority files to CD-R every year or two.)
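For the cron side, I'm picturing something like this, as a minimal sketch only; the paths, dump level, and schedule here are placeholders rather than my real setup:

    # root's crontab: level-0 dump of /home every Sunday at 02:30.
    # /backup/home.0.dump is a stand-in for wherever it really lands.
    30 2 * * 0 /sbin/dump -0au -f /backup/home.0.dump /home

Swapping "-f /backup/home.0.dump" for "-f -" is what lets dump write to stdout and be piped into a compressor, which is where the lzma experiment below comes in.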

My main concern there, of course, is how to do it without sacrificing too much disk space; after all, we're talking about a lot of crap lol. In writing my test script, I've also experimented with piping dump into lzma for compression, but at least with Vectra's scarce resources, that's a bit toooo much for the box to handle given the data sets involved. Then I started to think, gee, wouldn't it be cool to just keep a SQLite database that logs changes (cron'd from a script), and then periodically run zip on the target, excluding files unchanged since the last backup? Effectively a smart form of dump that works at a different file system layer (e.g. like tar or cpio).
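Just to make that idea concrete, here's a back-of-the-napkin sketch of the script I have in mind. To be clear: the database, table, and paths are all hypothetical, it only copes with quotes in file names via a crude sed escape, and the first run merely seeds the timestamp file:

    #!/bin/sh
    # Hypothetical "smart dump": log changed files in SQLite, then zip
    # only those since the last run. Every name/path here is made up.
    DB=/var/backups/changes.db
    TARGET=/home
    STAMP=/var/backups/last-run

    [ -f "$STAMP" ] || touch "$STAMP"   # first run: just seed the timestamp
    sqlite3 "$DB" 'CREATE TABLE IF NOT EXISTS changes (path TEXT);'

    # Log everything modified since the last backup run.
    find "$TARGET" -type f -newer "$STAMP" |
        sed "s/'/''/g; s/.*/INSERT INTO changes VALUES ('&');/" |
        sqlite3 "$DB"

    # Archive only the changed files, then clear the log and timestamp.
    sqlite3 "$DB" 'SELECT path FROM changes;' |
        zip -@ "/var/backups/incr-$(date +%Y%m%d).zip"
    sqlite3 "$DB" 'DELETE FROM changes;'
    touch "$STAMP"

zip's -@ flag (read the file list from stdin) is what makes excluding unchanged files fall out naturally; dump can't do that, since it works below the file level.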

Then I remembered that the best existing solution I've ever bumped into in my travels is a program called Bacula, although I've never had time to explore it. With a little poking around, it seems that Bacula is very much the kind of system I would like to have.

Which poses three questions:
  1. How well does it work with OpenBSD?
  2. How well does it handle disk space/compression tasks? (A guess at that in the sketch after this list.)
  3. When will I have time to read all the excellent documentation?
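On question 2, the little poking I've done suggests that software compression is configured per FileSet in the director's configuration (bacula-dir.conf). Something along these lines, though this is an untested sketch from skimming, and the resource names are just placeholders:

    FileSet {
      Name = "Home Set"
      Include {
        Options {
          signature = MD5      # checksums, used by verify jobs
          compression = GZIP   # client-side software compression
        }
        File = /home
      }
    }

If I'm reading it right, that puts the compression burden on the client machines, which may or may not suit poor old Vectra.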


So, sadly, it will probably be some time after the new year has come and gone before I'll have time to return to this loop; my RTM has been updated accordingly. On the upside, if three hard disks in separate locations of the building, with very controlled data replication patterns, somehow fail before then... the entire building will likely have collapsed, so that would be the least of my worries lol.
