
By Jack Ganssle

Embedded - A Volatile Business

Published in Embedded Systems Design, 2008

The embedded industry is one of the least studied high-tech arenas on the planet, despite the vast number of products we produce that change people's lives. Embedded.com conducts surveys from time to time, as do VDC and EE Times. I recently gained access to data collected and analyzed by Embedded Market Forecasters, which kept me glued to the computer for several hours; the company markets both the data and a tool that lets one create complex and fascinating crosstabs.

About 500 people filled in surveys, so the dataset, while not huge, is similar in size to those used by other studies. I always view studies with some suspicion, especially when they're used to bolster some quantitative argument. Similar questions posed to respondents of different studies invariably yield different answers. It's better to view them as impressionistic art; the outlines might be a bit fuzzy but they paint a qualitative picture. If 53.9% report using C++ it's probably safer to conclude that "about half" of developers use that language. (Actually, there's a more subtle issue at play: what does "using C++" mean? Using a C++ compiler to compile C code? Using C++ in a procedural manner? Taking advantage of the OO constructs?)

Then there's a need for a common-sense filter. In this study 23.2% (whoops - around a quarter) report using Visual Basic in response to the question "which language do you currently use for embedded designs?" I can't argue with the data, but I find this statistic a bit hard to believe. Every study I have read about this market contains similar credulity-stretching numbers for all manner of questions. I chalk this up to respondents not reading the "embedded" qualifier carefully, but I could be wrong, as happens 53.426% of the time.

Despite these disclaimers it's fun and enlightening to look at the data. Here's a peek at some results I found interesting.

Development times are generally short: 40% report under a year, and only about 13% are chained to one of those often-dreary projects of two years or more.

How big are teams? The answers vary widely, of course, but the mean is 7 software developers per team (I presume that means per project). But the distribution is heavily skewed to smaller groups as reflected by the median value of 3.

Only half as many people develop hardware, with reported mean and median values of 3.6 and 2 respectively. Nearly 70% employ fewer than 3.

Then there are the others: management, marketing, and all of the other parasites, er, nice folks who are considered part of the product team. There's about one of those for each hardware and software engineer.

Not surprisingly, the size of the code base has the same lopsided distribution. Though the average project consumes a bit over 100K lines of new code, half fall in the 1 KLOC to 50 KLOC range. Nearly 15% are tiny things needing under a thousand lines. Though the public perceives computers as massive multi-gigabyte computational juggernauts, we embedded-heads often use just a little bit of smarts to solve an engineering problem. And that's pretty cool.

Zero percent report programs using over 5M lines of new code. Only about 10% work on systems needing over 100KLOC.

Factor reuse into the equation and the numbers change considerably. Now 30% are working on systems with more than 100KLOC. 20% report not knowing how big the code base is, which suggests that many developers are healthily isolated from acquired IP by an API. There's hope for reuse.

Or is there?

The most popular commercial OS? Windows, in all of its various flavors. Linux is a not-terribly-close number two, followed by VxWorks and QNX. All the others are in the noise. Oddly, uC/OS-II wasn't included as a possible answer; other data I've seen pegs that RTOS in the top five.

About 10% have shipped a product in the last year incorporating a commercial distribution of Linux. Slightly more have shipped using a "free" version of that operating system. "Free" is in quotes because that includes those who have done their own port, which can be expensive.

Another head-scratching result: The top three criteria for selecting an OS are real-time performance, acquisition cost, and availability of source code, all of which argue against Windows. And in a clear warning to some OS vendors, respondents overwhelmingly prefer perpetual licenses over subscription models.

86% claim to prefer floating licenses over node-locked versions. While this question was asked in the OS context, I suspect most of us feel the same way about our IDEs and other tools. Vendor alert: developers are unhappy with dongles and other protection mechanisms! Though your concerns are certainly legitimate, you, along with the RIAA and MPAA, have to find a way to satisfy your customers. Don't, and watch the continued growth of FOSS alternatives.

The report contains a great deal of information about which CPUs we use. The overall patterns are intriguing. It appears almost 80% of us are using 32 bitters, yet only around 20% have adopted the ARM. Meanwhile, in its most recent filings, ARM claims to have "shipped" a billion CPUs in the last quarter. It appears that ARM, Intel and Freescale (in that order) have essentially the entire 32 bit market sewn up. No other vendor makes it out of the noise.

The 8 bit world is owned by Microchip and Atmel, in that order, an interesting finding considering Microchip's recent efforts to acquire Atmel. Non-Atmel 8051s come next, followed, again, by offerings from Freescale. Both Microchip and Freescale had strong financials in 2008.

60% of those surveyed report they don't use 16 bit CPUs; of those that do, the clear winner is TI's MSP430. x86 variants and Freescale's 68HC12 and 68HC16, along with parts from Renesas, make up the bulk of the other two-byte chips used.

Nearly 40% report using a DSP part; unsurprisingly, most picked TI devices. Essentially everyone in the DSP world uses parts from TI, Analog Devices, or Freescale.

FPGAs are everywhere, with 60% claiming to use one or more. Xilinx and Altera are the heavyweights here, with Atmel and Actel eating up most of the rest of the market.

C continues to dominate at 70% usage, a number that dovetails nearly exactly with other data I've read. C++ lands at a bit better than 50%, which also mirrors other studies. But when I've questioned C++ users carefully it turns out that only about 20% use it as an object oriented language; the rest are just crafting C with a C++ compiler.

Assembly hangs in at around 35%, not too surprising since every project has at least a BSP. With growing team sizes, typically only one or two people are needed deep in the hardware.

30% of us use Eclipse. That's an astonishing number considering that the Eclipse Foundation came into being just five years ago.

While agile techniques are the rage, and a lot of us rebel against the heavyweight Capability Maturity Model (CMM), about 10% use agile and another 10% CMM. An interesting new paper called "CMMI or Agile: Why Not Embrace Both?" (Software Engineering Institute, November 2008, http://www.sei.cmu.edu/publications/documents/08.reports/08tn003.html) shows the benefit of combining both approaches. Though I'm not sure I buy all of the arguments, it's a thought-provoking read.

Developers worry about the things we've agonized over for decades. The top concerns are poor requirements, lack of resources, and an inadequate schedule, in that order.

Management concerns abound. Half report that engineering has no input into budgets and schedules. Is it any wonder so many projects are so late? Even fewer claim to have open communication with the bosses.

It gets worse.

Two thirds of us get no training on our tools. Even more of us feel management does not quickly resolve conflicting directions given to the team.

The study contains a lot more data; it's a for-sale publication available from Jerry Krasner (jerry@embeddedforecast.com; www.embeddedforecast.com). Thanks, Jerry, for letting me share so much of the data.

Volatile Catastrophe

All C and C++ developers simply must read the paper "Volatiles Are Miscompiled, and What to Do About It" (Eric Eide and John Regehr, Proceedings of the Eighth ACM and IEEE International Conference on Embedded Software, Atlanta, Georgia, Oct 2008, currently at http://www.cs.utah.edu/~regehr/papers/emsoft08-preprint.pdf). Reader Bob Paddock sent me the link, and thereby wrecked my day. But thanks, Bob, this is important.

Simply put, compilers sometimes miscompile code using the volatile keyword.
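To make the failure mode concrete, here's a minimal sketch of my own (not lifted from the paper; the register and its address are invented) showing the kind of access the researchers tested. The standard requires every volatile read and write to actually occur, in order, at the machine level; a buggy optimizer may delete, duplicate, or reorder them.

    /* Hypothetical memory-mapped status register; the address is made up. */
    #define STATUS_REG (*(volatile unsigned int *)0x40001000u)

    void ack_interrupt(void)
    {
        /* On many peripherals, reading the status register clears a
           pending interrupt. This read must be performed even though
           the value is discarded; a buggy optimizer may delete it,
           and the interrupt then never clears. */
        (void)STATUS_REG;
    }

If the compiler drops that read, the bug is invisible in the source and shows up only as a system that mysteriously stops responding.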

In automated tests the researchers ran on thirteen compilers, every one generated incorrect code for accesses to volatile variables. The best of the lot miscompiled only 0.037% of the time; the worst, 18.72%. The former doesn't sound so bad, but even a single such error could cost weeks of debugging the generated assembly code. The error rate of the worst compiler is downright terrifying.

It's unclear how these errors apply to your code; it seems some code constructs are fine while others aren't. And there's no data to suggest that the compiler you're using isn't perfectly fine, but then, no one suspected the thirteen either. After reading this paper I'm left with zero confidence that any other compiler is any better.

The authors speculate that the problems stem from an essential conflict between optimization and the nature of volatiles.
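Common-subexpression elimination illustrates the tension. For an ordinary variable, folding two back-to-back reads into a single load is a legal and desirable optimization; applied to a volatile, it silently changes the program's behavior. A sketch, with an invented register name:

    volatile unsigned int adc_data;   /* stand-in for a memory-mapped ADC register */

    unsigned int average_two_samples(void)
    {
        unsigned int a = adc_data;    /* must compile to one real load */
        unsigned int b = adc_data;    /* must compile to a second, separate load */

        /* Were adc_data a plain unsigned int, the optimizer could
           legally reuse 'a' instead of loading again. Doing that to a
           volatile is precisely the sort of miscompilation the paper
           documents. */
        return (a + b) / 2u;
    }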

They make several recommendations. The first is the use of helper functions that encapsulate accesses to volatiles, thereby outfoxing the optimizer. It's important that the functions aren't inlined, of course, and that macros aren't used in their place. In the study, the helper functions eliminated 96% of all volatile miscompiles.
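The technique looks something like this (a sketch with names of my own choosing; see the paper for the authors' exact helpers):

    /* Compile these in a separate module so the optimizer can't see
       into, or inline, them; every volatile access then funnels
       through a real function call the optimizer must leave alone. */

    unsigned int vol_read_u32(volatile unsigned int *addr)
    {
        return *addr;
    }

    void vol_write_u32(volatile unsigned int *addr, unsigned int value)
    {
        *addr = value;
    }

So instead of x = STATUS_REG; you'd write x = vol_read_u32(&STATUS_REG);. The price is a function call per access, which is usually tolerable for device registers.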

Next, check the compiler's output. I hate this suggestion, though it's probably wise. The whole idea of using a high level language is to distance ourselves from the assembly.

Finally, leave optimizations turned off on modules that rely on volatiles.

It's sort of scary that we're building hugely complex systems on fundamentally unsound tools. One wonders how many other constructs miscompile as well.