Jack Ganssle's Blog

This is Jack's outlet for thoughts about designing and programming embedded systems. It's a complement to my bi-weekly newsletter The Embedded Muse. Contact me at jack@ganssle.com. I'm an old-timer engineer who still finds the field endlessly fascinating (bio).


Software Process Improvement for Firmware

January 21, 2019

Software Process Improvement (SPI) is the task of figuring out what a team does right, what needs to be improved, and then making specific suggestions to help the group generate better code, hopefully on a shorter schedule.

My interest is exclusively embedded systems, so my SPI work has been on firmware and on finding ways to help firmware teams improve. The issues faced by embedded engineers are different from those faced by people writing spreadsheets or developing web pages, because the work we do is always tied to hardware.

It's fun and gratifying to help teams out. It can be frustrating as well. The first thing I almost always find is that there are few metrics collected. The team has no quantitative sense about how they're doing. Some tell me things are peachy and they're world-class; others complain of utter chaos. Sometimes two different members of the same team will give me both answers! But it's extremely rare that anyone can give hard numbers.

Engineering without numbers is not engineering; it's art.

Without numbers, there's no way to compare a team's work to industry benchmarks. Is the team better than most? Average? Or sub-par?

I advise all developers, regardless of how well they think they're doing, to collect metrics. Do this religiously, all the time. Engaged in SPI? Collect numbers. Not doing SPI? Collect numbers. Capers Jones and others publish plenty of comparative data.
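Even something as crude as tracking defect density per release gives you numbers to hold up against published benchmarks. Here's a minimal sketch of the idea; the CSV file name and column layout are my assumptions for illustration, not anything prescribed by Jones or anyone else.

```python
# defect_density.py - minimal sketch of a metrics log (assumed CSV format).
# Tracks defects per thousand lines of code (KLOC) per release so the team
# has real numbers to compare against published industry data.
import csv

def defect_density(csv_path: str) -> None:
    """Print defects/KLOC for each release in a simple CSV log.

    Expected columns (an assumption for this sketch):
        release, total_loc, defects_found_in_test, defects_found_in_field
    """
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            kloc = int(row["total_loc"]) / 1000.0
            caught = int(row["defects_found_in_test"])
            escaped = int(row["defects_found_in_field"])
            print(f"{row['release']}: "
                  f"{(caught + escaped) / kloc:.2f} defects/KLOC found, "
                  f"{escaped / kloc:.2f} escaped to the field")

if __name__ == "__main__":
    defect_density("defect_log.csv")  # hypothetical log file
```

The point isn't the script; it's that the habit of logging the data makes the comparison possible at all.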

Long ago I took a series of classes on selling. As an engineer I found that pretty dreary! But one thing stuck: find a prospective customer's pain point, and address that. It resonates today, as almost every company that approaches me for help is driven by some pain. The two most common pain points: bugs that surface in the field and create (generally expensive) havoc, and a consistent inability to come close to meeting schedules.

Neither is easy to fix.

The bug problem is the more tractable of the two. There's a line of thinking that we shouldn't call these "bugs" at all; some prefer "errors," because the majority of problems really are mistakes we allowed to slip into the code, mistakes that could have been avoided in the first place.

One of the biggest sources of errors/bugs is poor elicitation of requirements. Two companies I worked with last year operated under regulatory requirements due to the safety-critical nature of their code; they had to conform to various IEC standards. Both did produce the voluminous quality reports that demonstrate such compliance. But both generated these ex post facto, long after their products shipped, which defeats the intent of the standards. I view this as writing all the code, making it work (more or less), and then commenting it.

Requirements are hard to discern. Sometimes really hard. But that's no excuse to shortchange the process. The fact that they are hard means more effort needs to be expended. We'll rarely get them 100% correct, but careful engineering is our job, and part of that is getting the requirements mostly correct.

Then there's design. While many in this industry have deprecated design, we need it now more than ever. And we have a pretty good idea of how much effort design deserves. NASA has shown:

[Figure: Percent of effort going to design]

So, when it comes to bugs/errors, a big chunk of them comes from the pre-coding work, or rather, from a lack of that work. Almost always the SPI effort uncovers a significant shortfall in these areas.

Then there's the implementation phase: writing and testing the firmware. The software zeitgeist today seems to be to write a lot of code, fast. But did you know the average team spends 50% of the schedule debugging? I guess the other half of the schedule should be called "bugging."

Better: slow down and write great code. Spend much less time debugging it. I could go on at great length here but won't. However, if there's anything we've learned in the last half-century of the quality movement it's that quality must be designed in. You can't bolt it on. A focus on fixing bugs will never lead to great code. Get it right first. Then fix the (few) inevitable bugs.

The agile community focuses on test, and that focus is admirable. I wish more teams could be so relentless about test. But we know that testing alone is just not adequate; study after study shows that unless one is measuring code coverage, testing will exercise only about half the code. Test, yes! But teams should think in terms of many filters, each filter catching some proportion of the bugs. The first filter is having each developer review his or her own code before compiling it, looking for problems. The compiler's syntax checker is another filter. Then there's Lint. Static analyzers. And so much more. Jones' work lists over 60 such filters. It would be insane to use them all, but it's sort of daft not to pick an appropriate (considerably smaller) subset.
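A filter chain can be as simple as a script that pushes every source file through each tool in turn and reports what each one caught. The sketch below uses gcc's syntax check and cppcheck purely as examples; substitute whatever filters your toolchain and process actually call for.

```python
# filters.py - sketch of a small "filter chain": run each source file through
# successive bug filters and report what each filter flagged. The specific
# tools here (gcc -fsyntax-only, cppcheck) are illustrative choices only.
import subprocess
import sys

FILTERS = [
    # (name, command prefix) - each tool prints diagnostics when it finds
    # something worth a human's attention.
    ("compiler warnings", ["gcc", "-fsyntax-only", "-Wall", "-Wextra"]),
    ("static analysis",   ["cppcheck", "--enable=warning,style", "--quiet"]),
]

def run_filters(files):
    for name, cmd in FILTERS:
        print(f"--- {name} ---")
        result = subprocess.run(cmd + list(files),
                                capture_output=True, text=True)
        output = (result.stdout + result.stderr).strip()
        print(output if output else "no findings")

if __name__ == "__main__":
    run_filters(sys.argv[1:])   # e.g. python filters.py src/*.c
```

Each filter is cheap to run; the expensive bugs are the ones that slip past all of them into the field.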

And there are metrics! An example is complexity. The following chart is from a client's product, which comprised about 6000 functions.

[Figure: Firmware complexity]

It's clear that complex functions were much buggier than simpler ones. We manage what we measure, and this is one of several quantitative assessments that are important.
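Producing a similar picture for your own codebase takes little more than bucketing functions by complexity and computing a bug rate per bucket. The sketch below assumes you can export per-function complexity and bug counts to a CSV; that file layout and the bucket boundaries are my assumptions, not the client data shown above.

```python
# complexity_vs_bugs.py - sketch: bucket functions by cyclomatic complexity
# and compute how buggy each bucket is. Assumes a CSV export with columns
# function, complexity, bug_count (a hypothetical format).
import csv
from collections import defaultdict

BUCKETS = [(1, 10), (11, 20), (21, 50), (51, 10**6)]  # illustrative ranges

def bug_rate_by_complexity(csv_path):
    totals = defaultdict(lambda: [0, 0])   # bucket -> [functions, bugs]
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            cc, bugs = int(row["complexity"]), int(row["bug_count"])
            for lo, hi in BUCKETS:
                if lo <= cc <= hi:
                    totals[(lo, hi)][0] += 1
                    totals[(lo, hi)][1] += bugs
                    break
    for (lo, hi), (funcs, bugs) in sorted(totals.items()):
        print(f"complexity {lo}-{hi}: {funcs} functions, "
              f"{bugs / funcs:.2f} bugs per function")

if __name__ == "__main__":
    bug_rate_by_complexity("functions.csv")  # hypothetical export
```

Once the numbers are visible, a complexity limit in the coding standard stops being an arbitrary rule and becomes a defensible engineering decision.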

Earlier I mentioned that the second big issue is an inability to meet schedules. I'll cover this in another post.

To end this post, two pieces of advice: collect metrics, and study software engineering constantly. There's a lot we know as a profession. Alas, that knowledge often doesn't get applied when actually doing the work.

Feel free to email me with comments.
