Ten Years After

The tenth anniversary column for ESP.

Published in ESP, June 2000


By Jack Ganssle


June, 1990. DSPs were a rarity. Embedded 32 bit processors were virtually unheard of. C++ was an idea rather than a product. In fact, even C was only just coming into its own as a standard language for building embedded systems. Address spaces were small, as was the firmware that filled them. Products were all far more power-hungry than they are today.

June, 1990 was also the first time this column appeared in Embedded Systems Programming. Now, 120 columns, a decade, and too much gray hair later, I have to admit to having learned a lot from the readers of ESP magazine and my Breakpoints column.

 In the last decade readers have sent me literally thousands of emails, which, without exception, have been thoughtful and interesting. Even when we have disagreements no one sends ugly flames; instead there's a thoughtful and polite attempt to change my errant thinking.

 And we do have disagreements. I've learned that using the M word (Microsoft) without immediately following it with "stinks" (or something stronger) riles folks up faster than any other subject. The level of passion takes my breath away.

 I've learned that open source is not a fad. When I write about how the open source movement bothers my capitalistic sense - that making money is a good thing - readers respond with compelling reasons why the market may be changing. So compelling, in fact, that I really don't have good arguments anymore against open source. Clearly the market demands new code distribution models, and this is one that seems to satisfy many customers' needs. And, we're seeing that companies are smart enough to find ways to profit from the movement.

 And I've learned that developers have, by and large, a hate-hate relationship with their tools, which is part of the motivation for open source. When pressed, many engineers will sing the praises of a particular compiler or debugger, but the general dissatisfaction level with tools is truly scary. Bugs, poor support, code bloat and a wealth of other problems infuriate the embedded community. Vendors ignore this at their peril.

While most computer users find the GUIs of today a refreshing improvement over command line interfaces, many developers still find any GUI distasteful. I disagree, since running a half dozen open windows is quite efficient, but I have to respect the large mass of readers who long for the simpler, and in their view more powerful, days of DOS. Though this is a point of disagreement, we do come together in yearning for tools that easily support automatic operation, rather like the batch files of yesteryear. It's painful to click through a lot of menus to do any repetitive operation.

 I've learned that 8 bits will never go away. Ever. Analysts and pundits have told me they see 8 and 16 bits disappearing over the next year or two, but developers disagree. I'm convinced we're on the brink of an explosion in embedded systems, with embedded processing filling every conceivable niche in our lives. Some of this will be the Internet appliances whose hype saturates all media channels. Much more will be tiny bits of processing, from smart tools to clever pens and intelligent wires. None of these needs a 32 bit monster.

 When I look at how embedded design is changing I see some very intriguing technologies that were either not available a few years ago, or were crippled by cost issues. For instance, IP cores, though not at all cheap, are now available in a wide array of configurations. FPGAs surround standard architecture CPUs. DSPs greatly reduce the amount of analog electronics required in many applications.

 On top of these technology trends, firmware is growing ever bigger, requiring what would once have been considered vast arrays of memory. Time to market windows continue to shrink, which is awfully scary when coupled with increased code size.

Perhaps the embedded world is headed in a couple of directions simultaneously. At the risk of hazarding some prognostications, I suspect that 16 bits may be a market segment that won't survive. 8 bits is cheaper and typically suffers only a 50% performance penalty. Applications needing the bigger address spaces will probably migrate to 32 bits, or to 32 bit hermaphrodites that run 32 bits internally over a narrower, word-sized external bus. The cost differential is rapidly disappearing.
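A rough illustration of where that penalty comes from - a minimal C sketch of my own, assuming a compiler with C99's <stdint.h>, not anyone's benchmark: data wider than the CPU's registers forces the compiler to synthesize multi-instruction sequences, so a narrow core only loses badly when the data itself is wide.

    #include <stdint.h>

    /* The cost of data width on a narrow core. On an 8 bit CPU the
       32 bit increment compiles to a carry-propagating sequence of
       byte operations; the 8 bit increment is a single instruction. */

    volatile uint8_t  ticks8;    /* fits an 8 bit part's registers */
    volatile uint32_t ticks32;   /* four bytes: the compiler sweats */

    void timer_tick(void)
    {
        ticks8++;    /* cheap on any core */
        ticks32++;   /* roughly four byte-wide adds, plus carries,
                        on an 8 bit machine */
    }

Keep the variables no wider than the data demands and an 8 bit part keeps up; let 32 bit quantities dominate and the bigger cores win.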

8 bit applications will probably continue to dominate ultra-low-cost and limited-power applications, as well as the middle ground of lower volume, moderate cost products. There are a lot of off-the-shelf 8 bit uPs and uCs that are perfect for these sorts of products.

 But I see 32 bits attacking the high-volume medium-to-low-cost product segment that 8 bits traditionally enjoyed. It's pretty expensive to put a big CPU into a small system, but if you're building millions of something, dumping a 32 bit core into a custom ASIC starts to look pretty attractive. As memories become more common on custom chips the total system cost argument will favor bigger processors.

The harbinger of this trend is the automotive industry, a business where engineering even a foot of wire out of a car is a big deal, given the enormous volumes. Even in this traditionally cost-sensitive area 32 bit CPUs are clearly the future.

A couple of years ago in these pages I predicted that GUIs would become very common in a wide range of embedded systems, from lawn sprinkler controllers to TV remote controls. Hasn't happened! But I still believe it's inevitable. Customers will demand it; we already see GUIs or GUI-like features on sub-$100 products like GPSes. It's pretty clear, though, that embedded GUIs won't run on 8 bit processors, since to my knowledge no vendor offers graphics support for anything less than a 32 bit CPU. So these high volume, low cost apps will require bigger processors.

 

Different Times, Same Tune

The pace of change in this industry leaves me dizzy. It's impossible to stay current without devoting far too many hours per month to reading the trade publications. Yet, despite this seeming pell-mell charge into the future, much remains the same.

We have so many new hardware technologies available today. Hardware design tools have evolved in sync, as well; today's EDA environment could not have been imagined a decade ago. In fact, back in the early 80s most hardware designers created schematics with pencils on huge sheets of vellum. How quaint! Firmware development, by contrast, has hardly changed from even forty years ago. In 1990 we created designs using (maybe) state charts or data flow diagrams - just like today. Our basic tools were a text editor (as now, with few if any changes), compilers that haven't evolved much, and debuggers which, if anything, are becoming simpler and less capable.

 While it's true that the Unified Modeling Language came into existence in the last decade, the fact is that very few firmware folks use UML, or even have much insight into what it is.

I've learned that reuse has, so far, failed to fulfill its promise. I remember, a decade ago, lots of enthusiasm for the "software IC": a wealth of reusable software components that would be as common as digital ICs. It hasn't happened, and even the dream seems dead. We all learned that reuse is very hard. Studies indicate that reusable components cost 30 to 200% more than their single-function counterparts. It's awfully hard to invest in the future when the boss is banging on us to ship today.
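For what it's worth, here's a sketch of what a "software IC" might look like - a dependency-free ring buffer in C. This is my own illustration, not anyone's shipping library; the names are invented.

    #define RB_SIZE 64u   /* capacity; must be a power of two */

    typedef struct {
        unsigned char  buf[RB_SIZE];
        unsigned short head;   /* next write index */
        unsigned short tail;   /* next read index  */
    } ring_buf;

    /* Returns 1 on success, 0 if the buffer is full. */
    int rb_put(ring_buf *rb, unsigned char byte)
    {
        unsigned short next = (rb->head + 1u) & (RB_SIZE - 1u);
        if (next == rb->tail)
            return 0;               /* full: refuse, don't overwrite */
        rb->buf[rb->head] = byte;
        rb->head = next;
        return 1;
    }

    /* Returns 1 on success, 0 if the buffer is empty. */
    int rb_get(ring_buf *rb, unsigned char *byte)
    {
        if (rb->tail == rb->head)
            return 0;               /* empty */
        *byte = rb->buf[rb->tail];
        rb->tail = (rb->tail + 1u) & (RB_SIZE - 1u);
        return 1;
    }

The full and empty checks, the careful masking, the documentation a stranger would need - everything that makes it drop-in reusable - is exactly where that 30 to 200% premium goes.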

The NIH ("not invented here") syndrome is as persistent today as it was a decade ago. NIH is a prime enemy of reuse. I find developers genuinely enjoy writing code, so resisting the urge to build it yourself is tough. Ten years ago some 80% of all RTOSes were homemade. Today, for very good and very bad reasons, the figure has slipped just a bit, to perhaps 70%. NIH wins and reuse suffers when one of the only commercially available "software ICs" gains so little mind share.
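The urge is understandable, too, because the seed of a homemade RTOS is seductively small. Here's a sketch - mine, with invented task names - of the cooperative, round-robin loop so many of those 70% start from:

    /* The germ of many a homemade RTOS: a cooperative scheduler.
       Each task does a little work and returns promptly. */

    static void poll_keys(void)      { /* scan the switches */ }
    static void update_display(void) { /* refresh the LCD   */ }
    static void service_uart(void)   { /* drain the FIFO    */ }

    typedef void (*task_fn)(void);

    static task_fn tasks[] = { poll_keys, update_display, service_uart };

    void scheduler(void)
    {
        unsigned i;
        for (;;)                     /* forever */
            for (i = 0; i < sizeof tasks / sizeof tasks[0]; i++)
                tasks[i]();          /* no preemption, no priorities -
                                        and no license fee */
    }

The real costs - priorities, preemption, interrupt-safe queues, and the testing to prove them - arrive later, which is where the commercial products earn their keep.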

Fear of the unknown is an understandable reason for building your own OS and other components. Today more than ever, developers worry that off-the-shelf products may harbor latent bugs. Never have so many safety-critical applications depended on firmware. Something has to happen to make these software components provably reliable. Perhaps certification to FDA and FAA standards will help.

Maybe we'll see more of these products going the open source route. I can't help but think that Wind River, now the overwhelmingly dominant force in RTOSes, could help itself and the industry by adopting an open source model for VxWorks and follow-on operating systems. With so few competitive threats there's seemingly little downside. An open source RTOS defeats most of the anti-commercial-RTOS arguments: customers can, if they truly must, maintain the product. They can look under the hood to understand the complex interactions between the OS and their application. And a huge body of very smart people, peering deeply into the code, will elevate both the product's quality and the perception of quality. As an old friend taught me, perception is reality.

I've learned that despite ten years of progress, firmware quality has remained somewhat stagnant. To my knowledge there are no formal studies of this, but my perception is that many of us are wrestling the quality demon with mixed - at best - success.

We are accumulating experience: firmware is responsible for quite a few large and small disasters, from the pacemaker that goes awry, to launch vehicle failures, to the recent, rather dramatic spacecraft failures. As the cost and frequency of these failures multiply, I suspect customers and governments will demand solutions.

In visiting a lot of companies I've yet to walk into a firmware shop that's certified to any level of the Capability Maturity Model. Not that the CMM is a panacea, but it is one of the very few models extant designed to rein in the chaos of software development. Figures show more and more companies becoming CMM compliant; are these all conventional IT businesses? Why are firmware people so resistant to the idea of adding a software engineering discipline to their processes? Some sort of formalized development strategy is critical to producing high quality code.

 One very interesting proposal for the old SDI ("Star Wars") program showed the potential power of reuse to increase quality. The suggestion: require that all of the code have been reused at least three times prior to its incorporation into the SDI project. Avoid the new, recycle old and proven components.

But it is important for us to recognize our successes at managing quality. What's the difference between a PC and an embedded system? More fundamentally, what's the definition of "embedded system"? In the old days this was an easy question; anything with an 8051 or similar small processor was embedded. Now we see embedded PCs, Linux making inroads into the embedded space, and a variety of other changes that confound a simple definition.

Perhaps the definition of embedded lies in quality. An embedded application runs. Reliably. Rebooting Windows every day or two doesn't much affect our lives. If we had to stop our car every 20 miles to reboot the engine controller, Detroit would probably go back to mechanical ignitions.

 And so I've learned that despite the increasing quality challenges we're facing, so far embedded firmware is, in general, about the only available model of quality code.

 

Fragments Forever

Looking over the embedded marketplace of the last decade, one glaring thing that remains the same is the horribly fragmented nature of the business.

 In June of 1990 I was in the tool business, emulators specifically, selling to a tiny fraction of the embedded marketplace. Though we supported a dozen CPUs, most sales calls sounded like "no, sorry, we don't do that processor." Then, as now, hundreds of different CPUs vied for a share of the embedded space. Each processor requires its own set of compilers, linkers and debuggers, as well as special developer skill sets.

The first couple of decades of this industry saw hundreds of mom & pop tool shops spring up. Most failed, a few grew and flourished, and the rest grimly held on despite inadequate cash flow. The field is narrower now. Most of those sub-$1 million outfits withered.

A decade ago I thought embedded was the natural realm of small companies serving tiny market niches. Now I see most of the mid-sized businesses being assimilated, leaving only the gargantuan and the minuscule. It's far from clear how this odd mix will serve developers. Or who will survive.

 I've also learned that embedded systems live forever. My email in-box fills with stories from readers who are stuck in maintenance on 15 year old Z80-based products. Or those whose old products have ceased to work because the parts they buy today, with unchanged part numbers, run faster and noisier than their older versions.

Watching the newsgroups (comp.arch.embedded) one sees constant postings from developers desperate to find old tools: PL/M, to maintain a product that's two decades old; the ancient Borland C, for an 80s-era x86 product.

 It's easy to tell people to check their tools into the version control system, but harder to see how anything will remain intact and available decades hence.

Today's merger and acquisition frenzy means developers often inherit code, sans documentation or tools, while the original engineers gleefully disappear to Tahiti with their newfound IPO millions. So we're still struggling with too much poorly written and inadequately documented firmware.

 Finally, I've learned that Tom DeMarco is wrong. He ruefully complains that software folks don't read. I've found they're hungry - desperate - for information. Readers devour this publication, and write when something is incorrect or not clear.

 So, thank you, gentle readers, for being so gentle in your corrections to my columns, for being so involved and willing to communicate. I've tried to respond to every email and will continue to do so. Don't hesitate to point out mistakes, or even better to suggest new ideas and approaches. I hope to continue sharing this with y'all for a long time to come.