Is 8 Bits Dying?
Summary: Mike Barr's recent column got readers thinking about the future of 8 bits.
In Mike Barr's recent article (http://www.eetimes.com/discussion/other/4372183/Trends-in-embedded-software-design?Ecosystem=embedded) he makes the prediction that 32 bit processors will eventually beat or match 8 bitters. I well remember meeting with an analyst around 1990 who told me with great certainty that 8 bits was dead and everything would be, in the near future, 32.
Responses to Mike's article are interesting and argue passionately for both sides of the issue. Chuck Manning thinks that decreasing 32 bit prices will push down their smaller brethren as well. I've made this argument many times in the past. When you can get an 8 bitter for a penny, whole new applications will open that we can't imagine today.
Chuck also notes that byte-wide processors eat less power, and can tolerate wider power supply voltages than 32 bitters. This is true, and low power is certainly a holy grail of the industry. But I can't see any reason why, sometime in the future, all CPUs won't run off just about any source of energy.
Miro Samek says "8-bitters make no sense." Part of his argument is that the CPU itself is just a tiny part of a typical microcontroller. Most of the real estate is devoted to memory and peripherals. This is a great argument. Except it's couched in the present tense and is therefore incorrect. Today you can buy an 8 bitter for a third (or less) of the price of the cheapest 32 bit part on the market. That's irrelevant for some applications yet life and death for others.
In the future I expect this will change. A Cortex-M0+ in 40 nm geometry requires less than 0.01 square mm of floor space. The CPU itself will eventually truly be an insignificant factor in the transistor budget or die size.
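To put that 0.01 square mm figure in perspective, here's a quick back-of-the-envelope calculation. The total die size used below is my own illustrative assumption for a small microcontroller, not a number from the article:

```python
# Back-of-the-envelope: what fraction of a small MCU die would
# a Cortex-M0+ core occupy at 40 nm?
core_area_mm2 = 0.01   # upper bound cited for the CPU core itself
die_area_mm2 = 4.0     # assumed total die size for a small MCU (illustrative)

fraction = core_area_mm2 / die_area_mm2
print(f"CPU core is at most {fraction:.2%} of the die")  # prints "CPU core is at most 0.25% of the die"
```

Even with a generously small assumed die, the core is a rounding error next to the memory, peripherals, and pads.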
But there are three conflicting and confusing trends that toss a little sand into this discussion.
First, many low end parts are built in fully-depreciated "antique" fabs at geometries that are almost laughable today. Until and unless parts built with more modern processes have paid off their multi-billion dollar fabs, those parts will carry a cost premium.
Second, there's another cost that won't go away. Let's face it: the future of 32 bit microcontrollers is ARM, and ARM collects a tax on each part sold. Those numbers are closely guarded, but I have heard rumors that for Cortex-style devices they run tens of cents. Even if all of the other costs were zeroed out, these devices can't compete in the most price sensitive applications. I've long thought that ARM's biggest competitor is the one that doesn't exist yet: a royalty-free open-source CPU supplied with all of the design support ARM provides. Will this happen? Probably. Will it be successful? One trend in the semiconductor industry has been a move away from support of proprietary tools in favor of the freebies, so an open-source CPU would certainly fit the manufacturers' models. But it's hard to see how a free movement can create the huge, mostly compatible, ecosystem ARM provides.
Third, silicon costs will continue to drop until they become a non-issue for low-end microcontrollers. The package will be where all of the money goes, and there's no reason why high- and low-end microcontrollers won't have the same pinouts and packages. Think a six-pin Cortex part.
So, the first and third arguments suggest 32 bits won't cost any more than 8 bitters. The wild card is the second, and it's hard to see how that will play out.
I do disagree with Miro's statement "I think that 8-bitters still thrive only because of powerful non-technical reasons, such as the immense intellectual inertia of the embedded community." No doubt some of that is true, but costs still drive engineering decisions. There's the parts cost, but also that of tools. I've worked a lot recently with ARM's very nice IDE, but it costs thousands of dollars. Microchip, in contrast, makes PIC tools for practically nothing. Sure, you can get GCC for ARM and set up your own environment, but that takes time and more expertise than a lot of low-end developers possess. And some teams demand the support they get from a vendor.
So will 32 bitters win? Probably, as Mike originally said, in the vast majority of applications. Will that be soon? I doubt it.
Published May 24, 2012