For novel ideas about building embedded systems (both hardware and firmware), join the 35,000 engineers who subscribe to The Embedded Muse, a free biweekly newsletter. The Muse has no hype and no vendor PR. Click here to subscribe.
By Jack Ganssle
I'll present my Better Firmware Faster seminar in Melbourne, Australia February 20. All are invited. More info here.
Pundits (http://www.desktoppipeline.com/161600216 and many other sources) predict Microsoft's Vista (née Longhorn) operating system will comprise at least 50 million lines of code, assuming the troubled OS is ever released.
50 million lines of code. The scale is staggering.
Expect a staggering number of bugs.
Though it's easy to poke fun at Microsoft, I'm impressed with the company's recent performance. Windows XP is, at least for me, a very stable product. The much reviled update service seems to be working; reports (http://blogs.zdnet.com/Ou/index.php?p=103) surprisingly suggest Internet Explorer has fewer security vulnerabilities in recent months than Firefox.
But any 50 MLOC program is a monster. How will they test it?
Well-written C and C++ code contains some 5 to 10 errors per 100 LOC after a clean compile, but before inspection and testing. At a 5% rate any 50 MLOC program will start off with some 2.5 million bugs.
Testing typically exercises only half the code. It's hard to devise tests that check rarely invoked exception handlers, deeply nested IFs, and nested loops. So the 50% test coverage number suggests Vista could ship with some 1.25 million bugs.
There are better ways to do testing that do produce fantastic programs. Code coverage, for instance, can ensure every branch and conditional has been taken. It's required by the FAA's DO-178B level A standard for safety-critical avionics. But the costs are unbelievable. It's not unusual for the qualification process to produce a half page of documentation for each line of code. A 50 MLOC program's doc might be 25 million pages long, consuming 50,000 reams of paper - a stack 2 miles high. Will Vista undergo this rigorous evaluation? Probably not.
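A quick sanity check on that paper stack, using the article's figures plus one assumption of mine: a sheet thickness of about 0.005 inches (heavier stock; standard 20 lb bond is a bit thinner, which would shave the stack closer to a mile and a half).

```python
# Sanity check on the documentation volume. PAGES_PER_LOC comes from
# the half-page-per-line figure above; INCHES_PER_SHEET is an assumption.

LOC = 50_000_000
PAGES_PER_LOC = 0.5            # half a page of qualification doc per line
SHEETS_PER_REAM = 500
INCHES_PER_SHEET = 0.005       # assumed paper caliper

pages = LOC * PAGES_PER_LOC                     # 25,000,000 pages
reams = pages / SHEETS_PER_REAM                 # 50,000 reams
miles = pages * INCHES_PER_SHEET / 12 / 5280    # stack height in miles

print(f"{pages:,.0f} pages, {reams:,.0f} reams, {miles:.1f} miles high")
```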
Maybe Microsoft routinely uses a very disciplined approach to software engineering, including the mandatory use of code inspections. Again, the numbers are interesting. Since good inspections typically find 70% of the system's mistakes, after inspection Vista might have 50 million LOC * 0.05 bugs/LOC * 0.30 defects remaining, or 750,000 bugs. If testing finds half of those, they're still shipping with some 375,000 problems.
What if Microsoft were certified to the highest level of the Capability Maturity Model? Level 5 organizations employ a wide range of practices to generate great software. A CMM5 project typically ships with 1 bug per thousand lines of code. For Vista that works out to 50,000 bugs.
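The whole defect ladder fits in a few lines of arithmetic. All of the rates here are the estimates cited above, not measured data:

```python
# Back-of-the-envelope defect projections for a 50 MLOC program,
# using the bug rates quoted in the article.

SIZE_LOC = 50_000_000          # Vista's predicted size
BUG_RATE = 0.05                # 5 bugs per 100 LOC after a clean compile

raw_bugs = SIZE_LOC * BUG_RATE                 # 2,500,000 to start
after_test = raw_bugs * 0.5                    # 50% test coverage: 1,250,000

after_inspection = raw_bugs * 0.30             # inspections find 70%: 750,000
after_both = after_inspection * 0.5            # then testing halves it: 375,000

cmm5_bugs = SIZE_LOC * (1 / 1000)              # CMM level 5, 1 bug/KLOC: 50,000

for label, n in [("raw", raw_bugs), ("test only", after_test),
                 ("inspect + test", after_both), ("CMM 5", cmm5_bugs)]:
    print(f"{label}: {n:,.0f} bugs")
```

Even the best case in this table ships with tens of thousands of defects, which is the point of the exercise.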
This isn't an anti-Microsoft rant. It's a peek inside the problems any organization has when building huge programs. Though we do indeed have ways to build better code, the costs are huge, and scale exponentially as the program size increases.
The largest commercial embedded systems I'm aware of are some cell phones which have around 5 million lines of code, generally a mix of C, C++ and Java. Though few if any of these companies work at CMM level 5, that 0.1% bug rate would yield 5,000 defects, a hopelessly buggy product. One can only hope that the most important features (like making a phone call) work well enough for most users most of the time.
Firmware size doubles every 10 months to two years, depending on which surveys one believes. Programs are gigantic today, and will be simply unbelievable tomorrow.
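Those doubling rates compound quickly. A rough projection, taking today's 5 MLOC cell phone as a starting point and a five-year horizon (both illustrative assumptions of mine, not survey data):

```python
# Projecting firmware growth from the doubling periods cited above.

def project(start_loc: float, doubling_months: float, years: float) -> float:
    """Code size after `years`, doubling every `doubling_months` months."""
    return start_loc * 2 ** (years * 12 / doubling_months)

start = 5_000_000   # today's largest cell-phone firmware
for months in (10, 24):
    print(f"doubling every {months} months -> "
          f"{project(start, months, 5):,.0f} LOC in five years")
```

At the fast end of the range, today's 5 MLOC phone becomes a 320 MLOC monster in five years; even at the slow end it grows more than fivefold.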
What do you think? How can we tame bug rates in these huge applications? Reply to email@example.com, as the response form on this web page hasn't been working. I guess there's a bug somewhere.