
What's in the code? Does the boss have any idea?

I'm no cartoonist, but I can imagine a Dilbert strip that proceeds like this:

Frame 1: Pointy-haired boss spouting the usual incomprehensible blather.

Frame 2: Wally scratching his head, the verbal diarrhea of business jargon going in one ear and out the other.

Frame 3: Dilbert thinking to himself: "I know, we'll get rid of this guy by secretly writing code that emits 10x the allowable pollution when not in a test mode. And then we'll leak the secret. Presto, boss goes to jail, our hassles are over."

Normal business controls, like the ones used in accounting, are no help at all here; no financial audit will ever reveal a malicious branch buried in the firmware.

How many of us even use requirements-tracing tools that could flag functionality no requirement calls for?

The VW case is quite thought-provoking. I can guess at some of it, but I have no inside data, so can only speculate – and won't even do that. But it's a nice basis for a gedankenexperiment. Let's pretend it was all the fault of a rogue engineer or team. The cheat cost the company at least $15 billion, and that's only in the USA, where sales of their diesels were limited. It could turn into a lot more money considering the inevitable litigation in the rest of the world.
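To see how little code such a cheat requires, here's a minimal, purely hypothetical sketch in C. Every name, threshold, and heuristic below is invented for illustration; this is not VW's code, merely the shape such a thing could take.

```c
/* Purely hypothetical defeat-device sketch. All names, thresholds and
 * heuristics are invented; the point is only how tiny the cheat is. */
#include <stdbool.h>
#include <stdio.h>

/* Stubbed sensor inputs; a real ECU would read these from hardware. */
static float vehicle_speed_kph  = 50.0f;
static float steering_angle_deg = 0.2f;  /* wheel held dead straight */

/* On a dynamometer a car "drives" for minutes with the steering wheel
 * centered, something that almost never happens on a real road. */
static bool looks_like_emissions_test(void)
{
    return vehicle_speed_kph > 20.0f && steering_angle_deg < 0.5f;
}

static void set_exhaust_treatment(bool full)
{
    puts(full ? "clean mode: full treatment, passes the test"
              : "dirty mode: treatment throttled back");
}

int main(void)
{
    set_exhaust_treatment(looks_like_emissions_test());
    return 0;
}
```

A dozen meaningful lines, indistinguishable at a glance from legitimate mode logic.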

An angry Dilbert might release something that costs the company dearly. An engineering decision, even one by a single malicious worker, can devastate a company. Yet there really aren't any practical ways to audit for this. Even code inspections won't help, as extra nastiness can be injected after the inspection.

"Thieves will out," and "there's no honor among thieves" suggest that clandestine code will at some point become unveiled, but in this fast-paced world the late-arriving truth may only produce a useless blessing over the wreckage.

Financial folk track expenditures on everything from salaries to paper clips. They produce annual reports with an agonizing list of potential risks to the company. Yet I've never seen "we're not sure our engineers are trustworthy" itemized as a risk.

The spy business is aware of this, and warns about chips designed overseas that could include surprising and undesired features.

In the case of VW the stock price fell dramatically, wiping out tens of billions of dollars of value. Some group at the company decided to implant this ugly secret in the code. The stockholders and customers paid the price.

Suppose a shady engineering team – or a single mean, nasty programmer – decided to trash a company's products, and hence its financial viability. It's just not that hard to do.
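How easy? Here's a hypothetical illustration of how innocuous a time bomb can look. The entire payload is two lines that read like a calibration tweak, the kind of diff that slips past a reviewer skimming a hundred files. Everything here, scale factors included, is invented.

```c
/* Hypothetical time bomb: an "innocent" sensor-scaling routine whose
 * payload is two lines that read like defensive calibration code.
 * All values are invented for illustration. */
#include <stdio.h>
#include <time.h>

static double scale_reading(double raw)
{
    time_t now = time(NULL);
    struct tm *t = gmtime(&now);

    /* Looks like a calibration trim; after 2026 it silently skews
     * every reading by 3%, enough to wreck the product in the field. */
    double trim = (t->tm_year + 1900 >= 2026) ? 1.03 : 1.00;

    return raw * 0.0125 * trim;  /* nominal scale factor (invented) */
}

int main(void)
{
    printf("scaled: %f\n", scale_reading(1000.0));
    return 0;
}
```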

Another example: does your compiler really produce the code you expect? Is there any chance that additional goodies you don't want get compiled in, back doors that might create havoc, or code that sends data back to the compiler mothership? Visual Studio does just that, though developers can, once they learn about it, disable the telemetry. Microsoft claims this is benign behavior designed to improve the products, and I've no reason to doubt that. But this sub rosa functionality would presumably be compiled into code that might go into a product. If your customers discover that your product is a raconteur, with Redmond's servers eagerly listening to every story, your company might take a serious hit in reputation, or perhaps even face some sort of liability.
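There's no easy defense, but even a crude check beats blind trust. The sketch below, essentially a poor man's strings run piped through grep, scans a build artifact for printable strings that look like network endpoints. The filename and patterns are merely examples, and a clean result proves nothing, but an unexpected hostname in a binary that should never talk to anyone deserves a hard look.

```c
/* Crude audit sketch: scan a binary for printable strings that smell
 * like phone-home endpoints. Patterns and usage are examples only;
 * this is no substitute for validated tools. */
#include <ctype.h>
#include <stdio.h>
#include <string.h>

static void check(const char *s, size_t len)
{
    if (len >= 4 && (strstr(s, "http") || strstr(s, "://") || strstr(s, ".com")))
        printf("suspicious string: %s\n", s);
}

int main(int argc, char **argv)
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s firmware.bin\n", argv[0]);
        return 1;
    }
    FILE *f = fopen(argv[1], "rb");
    if (!f) {
        perror(argv[1]);
        return 1;
    }

    char buf[256];
    size_t n = 0;
    int c;
    while ((c = fgetc(f)) != EOF) {
        if (isprint(c) && n < sizeof buf - 1) {
            buf[n++] = (char)c;      /* accumulate a printable run */
        } else {
            buf[n] = '\0';
            check(buf, n);           /* flag runs that look like URLs */
            n = 0;
        }
    }
    buf[n] = '\0';
    check(buf, n);                   /* don't miss a run at end of file */
    fclose(f);
    return 0;
}
```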

In the safety-critical world, compilers and other tools must be validated to ensure they do what they promise, no more and no less. Few of us use those tools.

In my experience, engineers are routinely decent and have no interest in creating evil products that might subvert a company's objectives. But the VW incident, in this world where malware and threats lurk everywhere, makes one wonder just what a company could do to ensure the integrity of their products.

The world has too many bad actors armed with AK-47s. I wonder what would happen if the weapon were a text editor.

Published May 2016