“My hard drive crashed and I’ve lost all my photos.” How many times have you heard that phrase? Chances are, even if you don’t know a keen photographer, it will have happened to a friend or family member.
There are two types of hard drive: one that has failed and one that is failing. They are mechanical, have a finite life and WILL DIE. FACT.
This fact affects everyone, not just photographers. And not just Windows-based computer owners either: your swanky MacBook Pro has a hard drive as well and IT WILL DIE! At this point, the technically aware but uninitiated readers will be saying, “Yes, but lots of computers now have SSDs (solid state drives). They’re not mechanical and I’ve got one, so I’m safe.” Well, no, my friend, you are not. SSDs also have a finite life and, unlike a conventional hard drive, which will often give off signs of impending doom, an SSD can be working one minute and deader than a dodo the next. Another FACT.
In short, if you store all your fantastic photos on one hard drive/computer, it’s one of those ‘when, not if’ situations. You then have to add the Pure Stupidity Factor (PSF): the “No, I’m only deleting some copies” that turns into “oh f**k, I’ve just deleted the RAW files”. Power cuts, dodgy USB cables, yadda, yadda. They are all conspiring to remove your photos from your grasp.
Backups. That’s what you need and lots of ’em!
In this post, I’m going to run through how my photos are stored, backed up, backed up again, and again. It’s not a ‘how to’ or a detailed guide. It’s just showing what systems I have in place which will hopefully give you some ideas of your own. I’m guessing it’s not perfect and I did manage to ‘lose’ my entire photo archive a couple of years ago down to one of those ‘pure stupidity’ events. But guess what? I was able to get it all back!
There are two aspects to this process, home and away. Away is pretty manual. Home is automated.
When I am out and about, be it on holiday, at a touring car weekend or whatever, I use a laptop (ASUS Zenbook) for quick and dirty photo editing and culling. After a day’s shooting, I COPY the contents of my memory cards onto the laptop hard drive. Note the emphasis on COPY: the photos are still on the memory cards but also on the laptop. I then also COPY the photos onto a USB hard drive, so everything exists in three places. The memory cards go in one safe place, the USB drive in another. Since we stay in our touring caravan while away, one of those safe places is NOT in the caravan!
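For anyone who likes to script this sort of routine, the copy-don’t-move workflow might look something like the sketch below. All the mount points and folder names here are made up for illustration; substitute your own card reader and drive paths.

```shell
#!/bin/sh
# Hypothetical paths -- adjust for your own card reader and USB drive.
CARD=/media/sd-card/DCIM
LAPTOP="$HOME/Photos/incoming"
USB=/media/usb-backup/incoming

mkdir -p "$LAPTOP" "$USB"

# cp, never mv: the originals stay on the memory card.
cp -r "$CARD"/. "$LAPTOP"/

# Second copy onto the external USB hard drive.
cp -r "$LAPTOP"/. "$USB"/
```

After this runs, every frame exists on the card, the laptop and the USB drive: three copies before you’ve even formatted a card.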
I also have a folder on the laptop set to upload automatically to Google Photos. Any processed images are exported to that folder and will automatically back up to the Google Cloud.
This is where things get complicated (to set up) but end up being pretty much automatic. My descriptions are going to get a bit ‘techie’ but hopefully remain understandable.
The hardware consists of a desktop PC running Lightroom. No photo files are stored on this machine, but it does look after some other functions.
I have two NAS (network attached storage) boxes connected to a cabled Gigabit network. WiFi is great for convenience but quickly gets swamped when shifting a lot of data around. The primary NAS is home-grown, based on an HP MicroServer containing five hard drives: one for the operating system (an open-source Linux distribution called OpenMediaVault), two 1TB drives for non-photography data, and two 4TB drives for photos (so those are the ones I’ll deal with here).
Tech-savvy readers will be assuming I have the two 4TB drives configured as RAID 1, or Mirror (meaning the contents of one drive are directly copied to the other, so you effectively have an additional copy in the event of a drive failure). Well, no, I do not. Reasons? I’ve never, ever, ever known a conventional hard drive to just die without some form of precursory data corruption. If that corruption affects my photo files, it will potentially be replicated across both drives. Neither does a Mirror compensate for the PSF: if I mistakenly delete a folder or files, they are instantly gone from both drives. No way back.
Instead, I have an rsync job (rsync is a Linux file replication command not dissimilar to XCopy on Windows) that replicates files from one drive to the other every four hours. The key here is that it will copy new and updated files, but it does not remove deleted files from the second, backup drive. PSF circumvented.
So, we now have files effectively living on two separate drives (albeit in the same box). This box is also continuously backed up to the Cloud using a product called CrashPlan. It’s not free and you have to jump through a few hoops to get it working in this environment, but it does work, offering unlimited storage, email status notifications and point-in-time version restores. I am not a professional photographer, so I use the ‘for home’ version, which actually allows us to back up all of our computers. As well as cloud backup, you can also back up to other devices on your account or, if you trust a friend, to anyone with CrashPlan installed via a backup code. I run CrashPlan on the desktop PC. It can be configured to run on the NAS box, but running it on the PC gives me visual confirmation that it is working and also lets me back up files on that computer.
The second NAS box is a (now quite old) Netgear ReadyNAS Duo with two 2TB hard drives installed. These are effectively in a Mirror, but everything on this device is retrievable by other means, so the point about Mirrors made above is of little concern.
Once again, I also have the Google Photos Uploader running on the PC. This monitors a folder on the second NAS; all processed photos are exported to that folder (I’ve got a structure of years and months for most photos, while special events with a lot of images get their own folders) and then uploaded to Google Photos.
All of the above protects the physical photo files. This is all well and good, but if you have spent hours processing those photos, it could be just as pain-inducing to lose your Lightroom catalog! I have CrashPlan set to back up the catalog files, but they are also located in a Google Drive folder, so they are effectively backed up twice, with version-based restore.
To my mind, this is a pretty robust system. It’s based partly on kit I already had (the Netgear box, network hardware, etc.) and partly on hardware purchased specifically for the purpose.
As mentioned, I did have to test it a couple of years ago when I was upgrading the drives in the primary NAS box: I misconfigured rsync, resulting in a mass file deletion on both drives, which obviously constitutes pure stupidity. I was able to restore everything from the CrashPlan backup. That amounted to just over 2TB of data! It took almost a month to restore (the download had to be throttled so as not to affect other services), but everything was recovered.