Engineering Ethics

By Jack Ganssle

Published in Embedded Systems Design, October 2002

In 1985 I sold six In-Circuit Emulators to the government of Iraq.

Today Iraq is part of the "Axis of Evil"; then they were our allies, partners in the US mid-east strategy. I suppose our government still had some reservations about sending high tech equipment there, since getting a valid export license took six months and several trips to the Department of Commerce in Washington, DC. But the approvals arrived and we shipped the gear. Iraq paid on time. We never heard a word from them again. Where were these things used? Were they part of a humanitarian medical program, or used in the development of weapons of mass destruction?

At the time Iraq was engaged in a mortal struggle with Iran, whose hostage crisis just a few years earlier had helped topple Jimmy Carter's presidency. Iran was our enemy, Iraq our partner in the region.

After the 1990 invasion of Kuwait two burly men unexpectedly showed up at the office, sporting suit coat bulges in the wrong places and business cards pretending employment by the Customs Department. The phone number was one I recognized from many years spent in the intelligence business, one that led to Langley, not DC.

These gorillas scared the receptionist, but were polite and most interested in all of the paperwork associated with the years-old sale. Every document related to the export licenses was duplicated and questioned; the products' technical capabilities and features went unexamined.

We were squeaky clean and quickly satisfied their inquiries. Yet I was left with a nagging fear that our completely-above-board sale may have created harm for someone. Was being legal the same as being ethical?

Fast forward to 2002. Everything has changed. Our relations with Iran are warming, those with Iraq headed from bad to worse. Today it might be possible to send equipment to Iran, but surely Iraq is a no-ship-zone.

Though our paperwork satisfied the government inspectors I was left in a quandary. What was right? Should we have sold these products to such an unhappy part of the world?

Like most tool vendors we had distributors in all of the major technology-consuming countries of the world, particularly in Western Europe and Japan. These are "safe" countries, bound by treaties, members of various mutual defense pacts, whose democracies neither oppress their people nor antagonize the world. Yet one cannot ignore the shifting alliances of Realpolitik. My parents were part of a generation where even some of these countries were quite at odds with America and others.

The enemy is a nebulous character whose identity varies with the times.

Can we separate our engineering efforts from politics? In my naïve youth we claimed that everything is political, that even personal relationships impact the world at large. Those nearly-Marxian dialectics were more the product of sophomoric philosophies and frustrations with the times. But now I cannot help but wonder if indeed even the actions taken by us little people have some grave consequences of international scale. It's perhaps audacious to suggest that the sale of a handful of emulators for 8 bit processors may have contributed to significant evil. But did my sale of these products, this decision to ship taken by inconsequential me and my tiny company, increase the amount of evil in the world?

A too-easy choice that might satisfy a search for ethical behavior might suggest shipping high tech products only to the most benign of countries. But suppose these tools were used to build products that pumped clean water into villages in Pakistan? Generate power for a community in Tibet? Clean up pollution in Romania?

How does one make these decisions? Search one's own soul, especially one grounded more in ones and zeroes than in international politics, and the answer's likely to be wrong. The law of unintended consequences means we cannot understand the implications of our actions over the long haul. It can twist our very best efforts into the worst of results. Yet, in my opinion, we cannot abdicate our responsibility to strive for the maximum amount of good. We engineers are the architects of the new order, and as such must consider how our work will wreak good or evil.

My experience with the Iraqi sale taught me that it's not enough to simply trust the government to decide. Ethics are personal.

ESC Musings

These thoughts came to mind after facilitating a Shop Talk discussion on engineering ethics at June's Chicago Embedded Systems Conference. I raised the big issues of products and politics, using the crisis in Kashmir as a focal point. What does that mean for those doing business with either India or Pakistan? Does arming the belligerents constitute a violation of our moral responsibilities? I was looking not for an answer, but for discussion and thoughtful insights.

The group wasn't interested. Perhaps those locales are too remote; maybe I'm too far off into metaphysics for nuts and bolts engineers. Maybe people feel powerless and unable to effect change on regional scales.

To my frustration several people suggested that we should simply do what the boss wants. I'm sure most people would react appropriately to an outrageous order. But big evils grow from the small decisions. The Nuremberg trials brought monsters to the dock. But what of the little people, the clerks who stamped the papers ordering a family onto the train to Dachau? If those people had even a glimmer of the consequences of their actions then they were active participants in an unimaginable horror. Yet they were doing what the boss ordered. Small cogs in a huge machine who were perhaps unable to effect change. Does that excuse their actions? Does it excuse ours when we follow the company's marching orders even when we're uncomfortable with the possible outcomes?

So I asked the ESC group about other ethical issues. A letter in the RISKS digest stirred up a hornet's nest when the author suggested that ignoring buffer overflow problems is far more than bad programming; this oh-so-preventable bug continually costs customers big bucks. It's trivial to write code that's immune to a buffer overflow attack, yet so many of Microsoft's products, for instance, have been repeatedly compromised. Don't Gates et al. have an ethical and legal responsibility to their user community to fix at least this simple vulnerability?

Most of the attendees agreed, but it's easy to blame others, and easier to target Microsoft.

Ethics are Personal

Merriam-Webster's Collegiate Dictionary (10th edition, 2000) defines "ethics" as "the discipline dealing with what is good and bad and with moral duty and obligation." Duty and obligation - the ethical life is not that of a moral dilettante; it's one that encompasses sometimes heavy burdens.

I was struck by comments made by the commencement speaker at a high school graduation this week. He said ethical behavior means understanding the difference between right and wrong, and then accepting both the responsibility and the accountability to do the right thing. Wow - we have to seek out accountability for our actions! Too many of us avoid that at all costs.

The crowd was full of happy parents and relatives, each cheering on their own graduate. Many parents had tears in their eyes; quite a few were single moms who had raised these kids more or less alone. Where was dad? Divorce does not mean abandoning the responsibilities for children. Dads should be held accountable for doing the right thing.

These are difficult times. Kennedy's "ask not what your country can do for you" admonition seems a quaint reminder of kinder times. This feels like the gimmee and get-rich-quick era. The headlines speak of corruption and dissolution. Enron, Andersen, Tyco, Merrill Lynch, and too many other corporate names splashed across the front pages in recent (June) weeks suggest that corporate America is the realm of sleaze, that CEOs will do anything, legal or otherwise, ethical or not, to inflate stock prices and build personal wealth.

It's up to each of us, individually, to effect change. Ignore the headlines. Act ethically. Deal with the agony of divorce but take care of the kids. One's own perceived needs pale compared with the responsibilities incurred with procreation. Our responsibilities transcend the rules set down by the courts or the Congress.

This week I met an electrician at a local boatyard, one who volunteered many free extracurricular hours to help rewire a friend's mast. George is one of the nicest, friendliest, and funniest fellows at that yard. What a guy! Later I learned he had manipulated the company's health care system to wangle eight months of disability pay for what was the most minimal of injuries. Though not at all an uncommon practice in today's scheming society, this seems deceitful and unacceptable to me. At work as at home we must conform to the highest of standards. Even when it's really hard.

The Brooklyn Bridge is one of America's icons. The first large suspension bridge, it incorporated many new construction ideas, including the massive use of structural wire rope. Roebling both designed and constructed the bridge, yet his company, the best wire rope vendor of the day, lost the wire contract to another firm. That firm provided, knowingly and with almost criminal intent, substandard material that could have jeopardized the safety of thousands of commuters. Only Roebling's sensationally redundant design saved the project. The history of civil engineering is filled with stories of crooked contractors and lousy materials. Pursuit of the quick buck always tosses ethics to the winds.

We developers have a responsibility to our customers, and we face the same temptation as Roebling's corrupt wire rope vendor: quality is hidden deep inside the product. No one sees the guts of our creations. A simple user interface might conceal hundreds of thousands of lines of code. Is it beautifully structured or a spaghetti mess?

A couple of the Shop Talk attendees voiced what we all know but seldom admit: we're lazy. It's easier to hack stuff out than do it right. Disciplined development is a core value of any workable approach to reliable firmware, but it's tedious and, well, disciplined! Banging away on the debugger making motors whir and lights flash is a lot more fun than sitting in front of a desk thinking - especially when the cubicle is so noisy that deep thought is impossible.

I'm struck by the correlation between beautiful and reliable code. The Therac-25, the earliest known embedded disaster, killed several people. Proximate causes included bad design and completely unstructured, unaudited, and totally convoluted code. A British helicopter accident resulted mostly from firmware so awful the review board gave up trying to read it after getting through a mere 17%.

Check out the uC/OS RTOS (www.micrium.com). Read the C listings. Then check out the source code to Windows CE. One is beautiful, written almost like poetry, with pride and discipline. The other looks like the vague mutterings of an insane software terrorist. One is safety certified to DO-178B standards. The other, well, let's just say it's great in easily-rebooted handheld appliances, but I wouldn't fly in a plane controlled by it.

The beauty of the great code lies deep in its innards, invisible to any consumer. Its elegance cannot be observed functionally. You'll never hear a customer say "hey, this thing hasn't ever crashed!" Yet the beauty that stems from making difficult and ethical development choices yields great, reliable, portable code.

It's a standard we must all hold ourselves to.

Think Globally, Act Locally

Act now. Do the right thing in your daily engineering efforts. Neither wait for others to take the lead, nor expect the boss to define the path to righteous development.

Do you check for buffer overruns? A vast body of experience shows that input strings from untrusted sources crash code. Skip these trivial checks and you're writing code that increases the unhappiness in the universe.
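Here's a minimal sketch of what those trivial checks look like in C. The buffer size and the uart_getchar() routine are hypothetical stand-ins for whatever input source your firmware reads; the point is that the copy loop can never write past the end of the buffer.

#include <stddef.h>

#define CMD_BUF_SIZE 64

extern int uart_getchar(void);   /* hypothetical: next byte, or -1 if none */

/* Read a line into buf, never writing past its end. Always leaves
 * room for, and appends, the terminating NUL. */
size_t read_command(char *buf, size_t buf_size)
{
    size_t n = 0;

    if (buf_size == 0)
        return 0;                        /* nowhere even for the NUL */
    while (n + 1 < buf_size) {           /* reserve one byte for the NUL */
        int c = uart_getchar();
        if (c < 0 || c == '\n')
            break;
        buf[n++] = (char)c;
    }
    buf[n] = '\0';
    return n;
}

A dozen lines and one comparison per character: that's the entire cost of immunity to this class of attack.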

If you use malloc(), do you check its return value? We all know that heap fragmentation can lead to malloc failures, so I'd argue that writing code that assumes success is more than poor design; it's unethical. It's dumping problems that we should deal with ourselves onto our users.
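In the same spirit, a sketch of a checked allocation; the function name and the recovery strategy are hypothetical, standing in for your own application's logic:

#include <stdlib.h>

/* A checked allocation. Never assume the heap obliged; on a
 * fragmented heap, allocation will eventually fail. */
double *alloc_samples(size_t count)
{
    /* calloc zeroes the block and, in a decent library, also guards
     * against overflow in the count * size multiplication */
    double *samples = calloc(count, sizeof *samples);
    if (samples == NULL)
        return NULL;    /* let the caller degrade gracefully: log it,
                           free a cache, or enter a safe state */
    return samples;
}

The caller must check the return value too, of course. The ethics lie in handling the failure where it happens instead of shipping the crash to the customer.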

Since no one really knows how to size a stack a priori, there's not the slightest doubt that we get it wrong more often than we'd like to admit. Guess too small and the gadget may crash erratically, or only long after it has shipped. Isn't it an ethical requirement to proactively monitor stack usage?
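One time-honored approach - a sketch only, since the stack symbol and size here are hypothetical and in a real project come from your linker script or RTOS - is to paint a task's stack with a known pattern before the task starts, then periodically measure the high-water mark:

#include <stdint.h>
#include <stddef.h>

#define STACK_WORDS 256
#define STACK_FILL  0xDEADBEEFu

/* Hypothetical task stack; real projects get this from the linker
 * script or the RTOS task-creation call. Assumes the stack grows
 * downward, so the low words are the last to be consumed. */
static uint32_t task_stack[STACK_WORDS];

/* Paint the stack before the task that uses it starts running
 * (painting a live stack would clobber the active frames). */
void stack_paint(void)
{
    for (size_t i = 0; i < STACK_WORDS; i++)
        task_stack[i] = STACK_FILL;
}

/* Count untouched words from the bottom up. Call periodically;
 * a shrinking result is an early warning of impending overflow. */
size_t stack_words_free(void)
{
    size_t i = 0;
    while (i < STACK_WORDS && task_stack[i] == STACK_FILL)
        i++;
    return i;
}

A few dozen bytes of code buys an answer to a question we otherwise only guess at.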

Lots of us feel cranking code is a lot more fun than detailed design, code inspections, and adherence to standards. Yet I contend it's worse than lazy to jump into coding: it leads to lousy products and frustrated users, and it's a fundamentally unethical way to build a product.

Is it ethical to accept an arbitrary, capricious, and impossible delivery date from the boss? That's engaging in a dysfunctional cycle of lying that's doomed to get worse.

Ethical behavior means accepting responsibility for your actions. As Harry Truman said, "The buck stops here." On your desk. Not your boss's.