
By Jack Ganssle

Static Analysis Tools

Published 7/22/04

Let's doff our hats in a moment of respect for Ada, a language whose promises were huge, yet which mostly failed in the embedded market.

Though some language lawyers delight in bashing technical aspects of Ada, to me its greatest merit was the nitpicking behavior of the compilers. The rule of thumb was "if you can make the damn thing compile it will probably run." Meanwhile legions of C programmers are, at this very moment, debugging mixed-up "=" and "==" constructs, tracking down failed malloc()s and hunting for null pointer dereferences. We C programmers manage to seed nearly an order of magnitude more bugs into our code than those working in Ada. It seems logical to use a tool that forces us to generate correct code, rather than to crank out lots of buggy stuff fast.
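To make those classics concrete, here's a contrived fragment (the error handler is just a placeholder, not from any real project) showing all three side by side with the versions a nitpicking tool would push you toward:

    #include <stdlib.h>
    #include <string.h>

    static void handle_error(void) { }   /* placeholder for real error handling */

    void broken(int status)
    {
        char *buf = malloc(64);   /* return value never checked...             */
        strcpy(buf, "hello");     /* ...so this may dereference a NULL pointer */

        if (status = 0)           /* "=" where "==" was meant: assigns 0,      */
            handle_error();       /* so the error path can never run           */
    }

    void fixed(int status)
    {
        char *buf = malloc(64);
        if (buf == NULL)          /* a failed malloc() is handled, not ignored */
            return;
        strcpy(buf, "hello");

        if (status == 0)          /* comparison, not assignment */
            handle_error();

        free(buf);
    }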

So do you at least use code inspections to catch these bugs before they ever reach the debugger? Probably not. Most developers eschew inspections, citing the trouble of rounding up a group of reviewers, an aversion to yet more meetings, or plain fear that shining the cold light of day into the tangled cobwebs of our source files would be mightily embarrassing.

Surely, then, you use an array of static analysis tools, products that delve deep into the code, exploring the jungle of tangled calls and variable relationships? For example, I imagine you augment the C compiler's meager syntax checker with the full-throated roar of Lint's steroid-enhanced analysis (Gimpel's PC-lint at www.gimpel.com, or the free Splint at http://www.splint.org).
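As a taste of what that buys you, here's a sketch (the function names are invented for illustration). Splint reads annotations buried in stylized comments, so declaring that a pointer may be NULL turns every unguarded dereference into a warning:

    #include <stdlib.h>

    /*@null@*/ char *get_buffer(size_t n)   /* return value may be NULL */
    {
        return malloc(n);
    }

    void careless(void)
    {
        char *p = get_buffer(64);
        *p = 'x';                 /* Splint warns: p may be NULL here */
    }

    void careful(void)
    {
        char *p = get_buffer(64);
        if (p != NULL) {
            *p = 'x';             /* no warning: the dereference is guarded */
            free(p);
        }
    }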

If not, why not?

Perhaps you're coding in SPARK (http://www.praxis-cs.co.uk/sparkada/spark.asp), an Ada subset that uses intriguing annotations and automatic analysis to yield something like a hundredth of the bug rate of C?

No?

Many of us advocate coding to a firmware standard. Yet it's tough to ensure we're not violating some obscure rule. Why risk myopia by squinting through reams of source listings? Do you use a tool like CodeWizard (www.parasoft.com) or QA-C (http://www.programmingresearch.com/solutions/qac3.htm) to automate standards compliance checks?
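For instance, here's a made-up fragment illustrating two rules found in most firmware standards (MISRA C has close equivalents): brace every controlled statement, and avoid accidental octal constants. A checker flags both mechanically, the kind of violations that sail right past a tired reviewer.

    extern void start_conversion(void);   /* hypothetical hardware routine */

    #define TIMEOUT    010                /* octal by accident: this is 8, not 10 */

    void poll_violating(int ready)
    {
        if (ready)
            start_conversion();           /* no braces: a later edit that adds a
                                             second statement changes the logic */
    }

    /* Compliant equivalents: */
    #define TIMEOUT_TICKS  10

    void poll_compliant(int ready)
    {
        if (ready) {
            start_conversion();
        }
    }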

Brian Kernighan said "Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it." My bookshelf groans under the weight of software texts that advise keeping functions simple. Convoluted code is bad. So, do you use a complexity analyzer like McCabe's QA (http://www.mccabe.com/iq_qa.htm)?
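To put a number on "convoluted" (counting conventions differ a bit between tools, so treat this as a sketch): cyclomatic complexity is essentially the count of decision points plus one, and it equals the number of independent paths through a function, a floor on the test cases needed to exercise them.

    /* Four decisions -- three ifs plus the short-circuit && --
       give this toy dispatcher a cyclomatic complexity of 5. */
    int dispatch(int cmd, int arg)
    {
        if (cmd == 0)
            return 0;
        if (cmd == 1 && arg > 0)
            return arg;
        if (cmd == 2)
            return -arg;
        return -1;
    }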

Sure, McCabe is expensive. But RSM (http://msquaredtechnologies.com/index.html) is just $195. If that's too much, check out the freebies at www.chris-lott.org/resources/cmetrics/.

Some developers tell me that static analysis tools are merely a crutch for the clueless, arguing that careful thought beats automation. Though it's true that any tool can be misused, the argument taken to its logical conclusion suggests we ignore the compiler's syntax warnings. Stop using debuggers. Ignore the spell checker's screams of agony as you torture the lexicon. Just get it right the first time, every time.

Humans are flawed creatures. We make mistakes. Static analysis finds tough problems fast. If we're professional software engineers, isn't it our responsibility to exploit every tool and technique that leads to higher code quality and shortens debugging?

What's your excuse?