
By Jack Ganssle

Human Failure

"Cessna 47 Quebec, turn left towards Mitchellville."

"Cessna 47 Quebec, roger," the pilot replied.

A few minutes later the air traffic controller more insistently repeated his command. The plane was approaching restricted air space near Washington, DC.

"47 Quebec, roger."

Radar tracks showed the plane was still bearing in on the no-fly zone. Exasperated, the controller fairly barked an order to turn immediately, this time giving the pilot a compass course. Shocked out of a mental fog induced by the vibration, the noise, and an assumption that everything was OK, the pilot made the course change.

This event, which eerily presaged Wednesday's incident, happened in 1984. I was the pilot.

I had been approaching Andrews Air Force Base, home of Air Force One, from Baltimore. With not much experience in the Cessna 172, I had been busy playing with the radios and navigation gear rather than listening closely to the controller. I'd heard a name other than Mitchellville and was somewhat puzzled, since the plane was already headed for that destination. Why was this fellow asking me to change course to a destination that was dead ahead?

I'd presumed he was confused, not me. Him, a controller with decades of experience, compared to my paltry few hundred hours as a pilot.

As an engineer I'm a great believer in the power of technology to solve problems. But there's always a person somewhere in the loop, and we are imperfect creatures operating in a world where there's less margin for error every year. A hundred years ago people operated horse-drawn vehicles at 10 MPH. Today we zoom down the road at 100 feet per second, hanging inches from each other's bumpers while chatting on the cell phone and eating a Big Mac. One second's inattention and someone dies.

As I write this the media are portraying the pilots who were forced down as bozos. Perhaps they are. But we can be sure that people will always make mistakes - dumb ones, and smart ones. Mistakes stemming from too little sleep or not reading the instructions. Sometimes our equipment confuses us, or one unit displays something a bit different than another, resulting in a bit of head-scratching as our vehicle propels us at breakneck speed through the air or down the highway.

Wednesday the system worked well, ironically mediated by more humans in the loop. An automated system that made decisions to shoot based on estimated threat would probably have knocked these two bewildered people out of the sky. Instead, helicopter and F-16 pilots evaluated the situation, realized the threat from a 1500 pound Cessna 150 was low, and successfully diverted the plane.

We technologists, designing systems for use by flawed humans, must always assume the user will do something stupid, or will be tired or confused. A hospital engineer recently described how, on one IV pole, he spotted three infusion pumps, all with inconsistent interfaces. Each emitted a different set of beeps in response to changing patient conditions. Imagine the poor intern, 24 hours into a shift, trying to understand what's happening as these things wail and the patient is crashing.
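Part of the fix is plain consistency. As a purely illustrative sketch (the names and severity levels here are my own invention, not drawn from any pump vendor or alarm standard), imagine every device on that pole sharing one alarm vocabulary and one annunciation routine:

#include <stdio.h>

/* Hypothetical shared alarm vocabulary: the same patient condition class
 * always sounds and reads the same way, no matter whose pump it is. */
typedef enum {
    ALARM_ADVISORY,   /* informational: display only, no tone        */
    ALARM_CAUTION,    /* needs attention soon: slow repeating tone   */
    ALARM_WARNING     /* immediate danger: continuous tone           */
} alarm_severity_t;

/* One annunciation routine used by every device: one tone pattern and one
 * message format per severity level. (In a real pump this would drive a
 * sounder and a display rather than printf.) */
void alarm_raise(const char *device, alarm_severity_t severity,
                 const char *condition)
{
    static const char *label[] = { "ADVISORY", "CAUTION", "WARNING" };
    static const char *tone[]  = { "silent", "slow beep", "continuous" };

    printf("[%s] %s: %s (tone: %s)\n",
           device, label[severity], condition, tone[severity]);
}

int main(void)
{
    /* Three pumps, one vocabulary: the exhausted intern hears and reads
     * the same thing for the same class of problem on every unit. */
    alarm_raise("Pump A", ALARM_WARNING,  "air in line");
    alarm_raise("Pump B", ALARM_CAUTION,  "dose nearly complete");
    alarm_raise("Pump C", ALARM_ADVISORY, "battery at 40%");
    return 0;
}

With one routine behind every alarm, the intern's pattern-matching works for her instead of against her.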

Though our system may work perfectly, accidents will still happen when our user is befuddled. Perhaps the next great arena is developing better machine/human interfaces. Towards that end I highly recommend Don Norman's book The Design of Everyday Things. In it he shows how we haven't even mastered the art of designing intuitive doorknobs. How will we manage much more complex embedded devices?

What do you think? Have you ever gotten into a dangerous situation after being baffled by a device?

(Eventually, in an effort to make the skies safe for mankind, and forced to choose between two expensive hobbies, I gave up flying in favor of the sea. Be very thankful. It's a lot easier to get a license than to be a safe pilot.)