
Software Liability Laws - Part 1

Summary: A proposed software liability law is fun, perhaps, but naïve.

Recently Poul-Henning Kamp, writing in ACM Queue (http://queue.acm.org/detail.cfm?id=2030258), proposed a new set of laws to protect users from harm caused by software. Mr. Kamp first quotes an excerpt from Ken Thompson's Turing Award lecture: "You can't trust code that you did not totally create yourself." This quote may have been appropriate in 1983 when Mr. Thompson delivered the lecture, but it is patently absurd today. We do trust a lot of code. Whether it's the code in your microwave or the code that injects fuel into your car's engine, we confidently expect software to work. It generally does.

The article proposes a law written in three clauses. Let's take them apart one at a time.

The first clause defers intentionally caused damage to the criminal code. It's supposed to cover malfeasance such as that demonstrated by Stuxnet, but what does "intentional" mean? (I hate to sound like a lawyer or a particular ex-president, but it will be lawyers arguing the cases.) Suppose a doctor ignored important advances in medicine for years. Generally he gets away with this, but then a patient suffers harm or dies. I have no doubt a court would find him guilty of malpractice, and would consider his actions intentional. Ignoring advances in his field is an intentional action, mirroring the old dictum "failing to make a decision is itself a decision."

One could make a pretty persuasive argument that, for instance, since the use of a sloppy development process is always intentional (because we decide what and how we'll do things), the inevitable results (bugs) are in effect intentional. Further, there's a wealth of literature demonstrating how poor processes translate into defects, so the developers, or their bosses, are in a position analogous to the doctor who didn't avail himself of the latest techniques. In fact, there are papers in the literature that have called certain development strategies "professional malpractice." This is not to say that the only standard is perfection. The best of doctors lose patients. It means any team whose processes aren't high quality could be liable. And, honestly, that could ultimately be a good thing! As Mr. Kamp humorously writes: "...any pundits and lobbyists they could afford would spew their dire predictions that 'this law will mean the end of computing as we all know it!' To which my considered answer would be: 'Yes, please! That was exactly the idea.'"

Then there's the second clause:

Clause 1. If you deliver software with complete and buildable source code and a license that allows disabling any functionality or code by the licensee, then your liability is limited to a refund.

This homage to open source is naïve. The vast majority of our users don't know the difference between a zero and a one. They simply haven't the skills to understand the source, let alone modify it to disable some functionality. Even those of us who can have neither the time nor the toolchains. We're surrounded by software, millions and millions of lines of the stuff. The trust that Ken Thompson eschewed is the only way we can practically cope with this web of technology.

If Novartis supplied me with a mass spectrometer and the chemical formulation for my blood pressure medicine, I still would not have the skill to test it for purity. In this supremely complex world we rely on the expertise of others. The fact that the medicine's chemical structure is printed on the instructions that accompany the pills does not absolve the manufacturer of liability for a bad batch.

Suppose the regenerative brakes in your hybrid car fail due to a software error, but the car came with a disk containing the source code. Does the automaker get a pass? Five people dead, perhaps, and the only liability is the price of the vehicle?

I'll hit the third clause next week.

Published December 14, 2011