On backups.
Aug. 4th, 2017 03:58 pm
There are two rules of backups.
1. A backup which can't be restored isn't a backup.
2. A backup which requires manual action will eventually not be made.
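Rule #2 is, in practice, an argument for automation. Here is a minimal sketch of what "no manual action" might look like, assuming a Python script run unattended by a scheduler (cron, launchd, a systemd timer); the rsync invocation and all paths are illustrative placeholders, not a prescription from the post.

```python
#!/usr/bin/env python3
"""Unattended backup sketch: run from a scheduler so no human has to remember it.

Assumptions (not from the original post): rsync is installed, and SRC/DEST
are hypothetical placeholder paths.
"""
import subprocess
import sys
from datetime import datetime

SRC = "/home/me/"                     # hypothetical source
DEST = "backup-host:/srv/backups/me"  # hypothetical destination

def run_backup() -> int:
    # --archive preserves ownership/permissions/times; --delete mirrors removals.
    result = subprocess.run(
        ["rsync", "--archive", "--delete", "--stats", SRC, DEST],
        capture_output=True, text=True,
    )
    stamp = datetime.now().isoformat(timespec="seconds")
    if result.returncode != 0:
        # A silent failure is how "will eventually not be made" happens;
        # exit nonzero so the scheduler can complain loudly.
        print(f"{stamp} backup FAILED:\n{result.stderr}", file=sys.stderr)
    else:
        print(f"{stamp} backup ok")
    return result.returncode

if __name__ == "__main__":
    sys.exit(run_backup())
```

The point of the wrapper is that a failure surfaces as a nonzero exit status the scheduler can alert on, rather than depending on anyone remembering to check.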
"Store it on other people's computers" is a valid approach, but while it reduces the likelihood of you doing something to lose the data, it places responsibility on other people to keep your data safe.
And nobody (except your enemies) thinks your data is more valuable than you do.
(no subject)
Date: 2017-08-07 08:53 pm (UTC)
I had not one but TWO backup strategies: one online, where I didn't test the restore (see your rule #1), and another which relied on a second hard drive (taken out by a power surge).
I'm thinking about using Glacier for my disaster-recovery strategy next time around, when I build my NAS, and you can believe that I'm going to test a full-bore restore before I call it done.
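For data parked in S3's Glacier storage class, a full-bore restore test has two separate steps, since Glacier retrievals take hours: request retrieval, then download once objects are ready. A hedged sketch using boto3 follows; the bucket name, prefix, and retrieval tier are assumptions for illustration.

```python
"""Sketch: request retrieval of every Glacier-class object under a prefix,
then check which are ready to download. Bucket and prefix are hypothetical."""
import boto3

s3 = boto3.client("s3")
BUCKET = "my-nas-backups"  # hypothetical bucket
PREFIX = "nas/"            # hypothetical prefix

def request_restores(days: int = 7) -> None:
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
        for obj in page.get("Contents", []):
            # Bulk tier is the cheapest and slowest; fine for a scheduled test.
            s3.restore_object(
                Bucket=BUCKET,
                Key=obj["Key"],
                RestoreRequest={"Days": days,
                                "GlacierJobParameters": {"Tier": "Bulk"}},
            )

def ready_to_download(key: str) -> bool:
    # The Restore header reads ongoing-request="false" once retrieval is done.
    head = s3.head_object(Bucket=BUCKET, Key=key)
    return 'ongoing-request="false"' in head.get("Restore", "")
```

Per rule #1, the test isn't done until the downloaded data has actually been checked against the original, not just retrieved.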
(no subject)
Date: 2017-08-14 02:41 am (UTC)
I have a three-layer backup strategy:
1. Time Machine. It's not subject to network woes and it's private, but it's in the same building and uses the same electrical feed, so it's vulnerable.
2. Automatic cloud backup (encrypted of course), so it's still there if the house gets struck by lightning again, but it relies on other people. I've tested it for small restores but haven't done a full-disk restore. (The friend who recommended the particular service has.)
3. Disk image on an external drive, stored off-site. This does not get refreshed nearly often enough and is why I added #2. (I used to do only 1 and 3.) Maybe I should drop this layer.
(no subject)
Date: 2017-08-30 05:07 pm (UTC)
Suppose that you are using an opaque backup service: you run their software, and X hours later they tell you that you have a backup. (The same applies to odd local things like tape archives, optical disks...) In that case, it is probably not good enough to restore a file or two and check for accuracy: you need to do a complete restore at some point before you need to do it for real. Having spare hardware for this is immensely helpful.
Now suppose you have a transparent backup system, where you copy a filesystem out to a live target and can mount it, move through it, and otherwise be assured that it is a real duplicate of your data. For that, occasional restores (especially testing edge cases -- ownership, other filesystem properties, checksums, large (>2GB) files, sparse files, and so forth) should be sufficient to give you confidence that the whole thing is working well.
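A minimal sketch of that kind of spot check, assuming both the original tree and the restored copy are mounted locally; SRC and RESTORED are hypothetical paths, and sparse-file and extended-attribute checks are left as extensions.

```python
"""Compare a restored tree against the original: ownership, size, checksums.
SRC/RESTORED are hypothetical mount points; sparse-file checks (e.g. comparing
st_blocks) and xattrs are left as extensions."""
import hashlib
from pathlib import Path

SRC = Path("/data")              # hypothetical original
RESTORED = Path("/mnt/restore")  # hypothetical restored copy

def sha256(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def compare() -> list[str]:
    problems = []
    for src_file in SRC.rglob("*"):
        if not src_file.is_file() or src_file.is_symlink():
            continue
        other = RESTORED / src_file.relative_to(SRC)
        if not other.exists():
            problems.append(f"missing: {other}")
            continue
        a, b = src_file.stat(), other.stat()
        # Ownership and size are cheap to check; checksums catch silent corruption.
        if (a.st_uid, a.st_gid) != (b.st_uid, b.st_gid):
            problems.append(f"ownership differs: {other}")
        if a.st_size != b.st_size:
            problems.append(f"size differs: {other}")
        elif sha256(src_file) != sha256(other):
            problems.append(f"checksum differs: {other}")
    return problems

if __name__ == "__main__":
    for p in compare():
        print(p)
```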
Basically, the closer the backup is to a time-delayed mirror, the less evidence you should need.