
By Jack Ganssle

Summary: Securing a system means a lot more than disconnecting it from the Internet.

Several people responded online to my article about SCADA security vulnerabilities (http://www.embedded.com/columns/breakpoint/224202612), and others wrote to me directly. A common thread seems to be "disconnect those machines from the Internet!" While I generally agree with the sentiment, there's more to security than installing an air gap.

One reader who wishes to remain anonymous wrote: "A few interesting thoughts. First, disrespect leads to carelessness, and carelessness leads to very bad things happening. For example, Three Mile Island was caused by people ignoring the warning systems and overriding the automatic protections. They said, 'this can't be happening.'" [Note from Jack: That was also true at Chernobyl; experts just didn't believe an explosion was possible, even after the event.]

"Tin whiskers on PCBs are another example; they can cause dangerous failures. This became a problem because, again, people became careless (letting politicians decide technical problems is always extremely dangerous). They forgot why tin-lead solder existed: tin whiskers were a known problem, solved by using tin-lead solders.

"Security is much the same. Before implementing any changes to a system, one must decide what failures those changes can introduce. Vandalism, interference, system noise, and the like are all contributing factors. People forget that security isn't just bolting a chunk of armor onto something or putting a lock box somewhere. As at Three Mile Island, people must assume there will be unexpected failures and design workarounds. I myself assume that what I am designing is NOT going to work as expected, and that I need to figure out the failure modes before wasting my employer's money on it.

"I think all of us can learn from the past in this case. Even software programmers can learn from simple things like locking the door (passwords), shutting the curtains (don't broadcast an available portal), and turning on the outside lights (watch for intrusion methods and let the intruders know they are being watched). These are lessons learned over thousands of years; history only repeats itself if we forget it. Instead of 'common sense,' I believe I would use the phrase 'wisdom of past failures.' Many people fail to learn from others' mistakes and repeat them as a consequence."
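The reader's door-lock and porch-light analogies map directly onto firmware. Here's a minimal sketch, not from the reader's letter, of what "lock the door and turn on the lights" might look like for a console login on an embedded target; the platform hooks get_uptime_seconds(), log_event(), and check_password() are hypothetical stand-ins for whatever your hardware and RTOS actually provide:

    #include <stdbool.h>
    #include <stdint.h>

    #define MAX_FAILURES  5     /* failures allowed before lockout          */
    #define LOCKOUT_SECS  300   /* lockout makes brute-force probing slow   */

    /* Hypothetical platform hooks; substitute your own. */
    extern uint32_t get_uptime_seconds(void);       /* monotonic clock      */
    extern void     log_event(const char *msg);     /* audit trail          */
    extern bool     check_password(const char *pw); /* constant-time check  */

    static unsigned failures;        /* consecutive failed attempts         */
    static uint32_t lockout_until;   /* uptime at which lockout expires     */

    /* Gate a console login: lock the door (password), and turn on the
     * outside lights (log every attempt, so intruders know they are
     * being watched and a probe leaves tracks). */
    bool try_login(const char *pw)
    {
        uint32_t now = get_uptime_seconds();

        if (now < lockout_until) {
            log_event("login attempt during lockout");
            return false;
        }

        if (check_password(pw)) {
            failures = 0;
            log_event("login OK");
            return true;
        }

        log_event("login FAILED");
        if (++failures >= MAX_FAILURES) {
            failures = 0;
            lockout_until = now + LOCKOUT_SECS;
            log_event("lockout engaged");
        }
        return false;
    }

Nothing here is exotic: failed attempts are logged and throttled, so a brute-force probe both slows down and leaves a record. That's the porch light in about thirty lines.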

The last sentence of that letter summarizes the history of technology: we learn from disasters, if we choose to. Alas, the firmware community seems to be far behind the civil engineers at that sort of learning.

The airwaves this week have been full of chatter about the Google attacks. Some analysts claim these are coordinated attempts by a foreign power. I wonder if anyone is looking for similar intrusions in our SCADA infrastructure?

Published April 21, 2010