By Jack Ganssle
A technician in Alaska accidentally wiped out the electronic records of a $39 billion fund.
According to an AP report (http://www.iht.com/articles/ap/2007/03/20/america/NA-GEN-US-Lost-Data.php), during maintenance he reformatted both the main and backup hard drives. Worse, the backup tapes turned out to be unreadable.
According to the article, no one was blamed for the $39 billion error. The commissioner said: "Everybody felt very bad about it and we all learned a lesson. There was no witch hunt."
I would have fired those responsible for backups. There is no excuse today for losing data. The technology is widely available and cheap.
Ironically, they were able to recreate the data from the ultimate backup source: 300 cardboard boxes containing paper records. Some may remember that England created a digital version of the Domesday Book in the 1980s. Just a decade later the laserdiscs were obsolete and unreadable. The original 921-year-old manuscript is still in good condition.
Paper may be the ultimate backup medium, but it sure is hard to use. Some sorts of data, like software, may never exist on the printed page. And paper is no guarantee of preservation, considering how many libraries have been destroyed by fire.
It's not just libraries. In the last couple of months three companies have contacted me with tales of massive fire damage. One will probably fold. Insurance covered the equipment losses but all of their data is gone. The backup tapes were consumed in the inferno.
Hurricane Katrina should have taught us about keeping off-site backups: way off-site, since a single event can take out an entire city.
We engineers don't seem to be much better than Alaska's government. In 1999 the FAA lost critical software for controlling flights around O'Hare when a disgruntled programmer deleted all of the non-backed-up code from his computer.
In August of 2003 the Electronic Frontier Foundation announced their FTP site had been hacked, and they had no backups.
A 2002 survey in SD Times showed that 40% of developers don't use a version control system (VCS). Without a VCS it's impossible to do disciplined backups. The VCS stores source code and other files in a central database that gets backed up daily (one hopes) by the IT folks.
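The discipline is cheap to adopt. As one minimal sketch (using git, with hypothetical paths; the point generalizes to any VCS), a repository's entire history can be packed into a single file and copied off-site:

```shell
#!/bin/sh
set -e

# Sketch: put source under version control, then snapshot the whole
# repository into one portable file for off-site storage.
# All paths here are hypothetical examples.
mkdir -p /tmp/project && cd /tmp/project
git init -q .
echo 'int main(void) { return 0; }' > main.c
git add main.c
git -c user.name="dev" -c user.email="dev@example.com" \
    commit -q -m "Initial commit"

# A git "bundle" holds the full repository history in a single file;
# copy it to tape, another building, or another city.
git bundle create /tmp/project-backup.bundle --all

# Restoring after a disaster: clone straight from the bundle file.
git clone -q /tmp/project-backup.bundle /tmp/project-restored
```

Verifying that the restore actually works, by cloning the backup and comparing it to the original, is the step most shops skip and the one that would have saved Alaska's fund.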
That data is probably more valuable than any other asset most companies own. So why do we tolerate sloppy backups?