
Small is Beautiful

Small processors are more useful and common than those 32 bit brutes we see advertised everywhere.

Published in EDN, May, 1995


By Jack Ganssle

Shhhh! Listen to the hum. That's the sound of the incessant information processing that subtly surrounds us, that keeps us warm, washes our clothes, cycles water to the lawn, and generally makes life a little more tolerable. It's so quiet and keeps such a low profile that even embedded designers forget how much our lives are dominated by data processing. Sure, we rail at the banks' mainframes for messing up a credit report while the fridge kicks into auto-defrost and the microwave spits out another meal.

The average house has some 40-50 microprocessors embedded in appliances. There's neither central control nor networking: each quietly goes about its business, ably taking care of just one little function. This is distributed processing at its best.

Billions and billions of 4 to 16 bit micros find their way into our lives every year, yet mostly we hear of the few tens of millions that reside on our desktops. As Nick Tredennick says, RISC and high-end CISC are a zero percent market by comparison... yet this zero percent market gets all of the glory. Seems a little odd.

Now, I'd never give up that zillion MIP little beauty I'm hunched over at the moment. We all crave more horsepower to deal with Microsoft's latest cycle-consuming application. I'm just getting tired of 32 bit hype for embedded applications. Perhaps that 747 display controller or laser printer needs the power. Surely, though, the vast majority of applications do not.

A 4 bit controller that formed the basis for a calculator started this industry, and in many ways we still use tiny processors in these minimal applications. That is as it should be: use appropriate technology for the job at hand.

Derivatives of some of the earliest embedded CPUs still dominate the market. Motorola's 6805 is a scaled-down 6800, a part that competed with the 8080 back in the embedded dark ages. The 8051 and its variants are based on the almost 20 year old 8048.

8051s, in particular, have been the glue of this industry, corresponding to the analog world's old 741 op amp or the 555 timer. You find them everywhere. Their price, availability, and on-board EPROM made them the natural choice for applications requiring anywhere from just a hint of computing power to fairly substantial controllers with limited user interfaces.

Now various vendors have migrated this architecture to the 16 bit world. I can't help but wonder if this makes sense, as scaling a CPU, while maintaining backwards compatibility, drags lots of unpleasant baggage along. Applications written in assembly may benefit from the increased horsepower; those coded in C may find that changing processor families buys the most bang for the buck.

I'm fascinated with Microchip's PIC16/17 processors, which seem to be squeezing into a lot of low end applications. These are cool parts. The smaller members of the family offer a minimum amount of compute capability that is ideal for simple, cost-sensitive systems. Higher-end versions are well suited for more complicated control applications.

Designers seem to view these CPUs as something other than computers. "Oh, yeah, we tossed in a couple of PIC16s to handle the microswitches," the engineer relates, as if the part were nothing more than a PAL. This is a bit different from the bloodied, battered look you'll get from the haggard designer trying to ship a 68030-based controller. The microcontroller is easy to use simply because it is stuffed into easy applications.

L.A. Gear sells sneakers that blink an LED when you walk. A PIC16C5x powers these for months or years without replacing the battery. Scientists tag animals in the wild with expendable subcutaneous tracking devices powered by these parts. Household appliances depend on PIC variants.

A friend developing instruments based on a 32 bit CPU discovered that his PLDs don't always properly recover from brown-out conditions. He stuffed a $2 Microchip controller on the board to properly sequence the PLDs' reset signals, ensuring recovery from low-voltage spikes. The part costs virtually nothing, required no more than a handful of lines of code, and occupies the board space of a small DIP. Though it may seem weird to use a full computer for this trivial function, it's cheaper than a PAL.
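The whole job really is just a handful of lines. Here's a minimal sketch of what such a brown-out sequencer might look like, written as plain host-testable C rather than for any particular PIC; the names, the timing constants, and the two-reset arrangement are all assumptions for illustration, not my friend's actual code:

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical brown-out reset sequencer: hold both PLD reset lines
 * asserted until the supply has been good for HOLD_MS consecutive
 * milliseconds, then release them one at a time. Any dip restarts
 * the whole sequence. */

#define HOLD_MS    50   /* debounce period for the power-good input */
#define STAGGER_MS 10   /* delay between releasing the two resets   */

typedef struct {
    uint16_t good_ms;   /* consecutive ms the supply has been good */
    bool reset_a;       /* true = reset line asserted */
    bool reset_b;
} sequencer_t;

void sequencer_init(sequencer_t *s) {
    s->good_ms = 0;
    s->reset_a = true;
    s->reset_b = true;
}

/* Call once per millisecond with the current power-good sense input. */
void sequencer_tick(sequencer_t *s, bool power_good) {
    if (!power_good) {               /* any dip restarts the sequence */
        sequencer_init(s);
        return;
    }
    if (s->good_ms < HOLD_MS + STAGGER_MS)
        s->good_ms++;
    if (s->good_ms >= HOLD_MS)
        s->reset_a = false;          /* release the first PLD */
    if (s->good_ms >= HOLD_MS + STAGGER_MS)
        s->reset_b = false;          /* then the second */
}
```

On the real part the two booleans would simply drive two output pins from a timer tick; everything interesting is in the debounce-and-stagger logic above.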

Not that there's anything wrong with PALs. Nothing is faster or better at dealing with complex combinatorial logic. Modern super-fast versions are cheap (we pay $12 in singles for a 7 nanosecond 22V10), easy to use, and their reprogrammability is a great savior of designs that aren't quite right. PALs, though, are terrible at handling anything other than simple sequential logic. The limited number of registers and clocking options means you can't use them for complicated decision making. PLDs are better, but when speed is not critical a computer chip might be the simplest way to go.

As the industry matures lots of parts we depend on become obsolete. One acquaintance found the UART his company depended on no longer available. He built a replacement in a PIC16C74, which was pin-compatible with the original UART, saving the company expensive redesigns.
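A software UART like that is mostly framing plus careful bit timing. The framing half is easy to show; this sketch in plain C (the function name and the array representation are invented for illustration) writes the bits into an array instead of toggling an I/O pin, so the logic can be checked on a host:

```c
#include <stdint.h>

/* Sketch of the framing at the heart of a bit-banged (software) UART.
 * An 8N1 frame is one start bit (low), eight data bits LSB-first, and
 * one stop bit (high). On a PIC the firmware would write each bit to
 * an output pin and spin for one bit time (about 104 us at 9600 baud);
 * here the bits go into an array so the logic is testable. */

#define FRAME_BITS 10  /* start + 8 data + stop */

void uart_frame_8n1(uint8_t byte, uint8_t bits[FRAME_BITS]) {
    bits[0] = 0;                          /* start bit */
    for (int i = 0; i < 8; i++)
        bits[1 + i] = (byte >> i) & 1;    /* data bits, LSB first */
    bits[9] = 1;                          /* stop bit */
}
```

The receive side is the same frame run backwards: wait for the falling edge of the start bit, delay half a bit time to land mid-bit, then sample eight times at one-bit intervals.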

In the good old days of microcomputing hardware engineers also wrote and debugged all of the system's code. Most systems were small enough that a single, knowledgeable designer could take the project from conception to final product. In the realm of small, tractable problems like those just described, this is still the case. Nothing measures up to the pride of being solely responsible for a successful product; I can imagine how the designer's eyes must light up when he sees legions of kids skipping down the sidewalk flashing their L.A. Gears at the crowds.

OTP

Part of Microchip's success comes from the aggressive use of One-Time Programmable (OTP) program memory in all of their processor variants. Motorola and others make good use of OTP as well. Now, a number of PLDs are also available in OTP versions.

It's a sure bet that this technology will become ever more important in the future.

OTP memory is simply good old fashioned EPROM, though the parts come without an erasure window. That small quartz opening typical of EPROMs and many PLDs is very expensive to manufacture. You can program the memory on any conventional device programmer but, since there's no window, you can never erase it. When it's time to change the code, you'll toss the part out.

Intel sold OTP versions of their EPROMs many years ago, but they never caught on. A system that uses discrete memory devices - RAM, ROM, and the like - has intrinsically higher costs than one based on a microcontroller. In a system with $100 of parts, the extra dollar or two needed to use erasable EPROMs (which are very forgiving of mistakes) is small.

The dynamics are a bit different with a minimal system. If the entire computer is contained in a $2 part, adding a buck for a window is a huge cost hit. OTP starts to make quite a bit of sense, assuming your code will be stable.

The code can be cast in concrete in small applications, since the entire program might require only tens to hundreds of statements. Though I have to plead guilty to one or two disasters where it seemed there were more bugs than lines of code, a program this small, once debugged and thoroughly tested, holds little chance of an obscure bug. The risk of going with OTP is pretty small.

You can't pick up a magazine without reading about "time to market." Managers want to shrink development times to zero. One obvious solution is to replace masked ROMs with their OTP equivalents, as producing a processor with the code permanently engraved in a metallization layer takes months... and suffers from the same risk factors as does OTP. The masked part might be a bit cheaper in high volumes, but this price advantage doesn't help much if you can't ship while waiting for parts to come in.

Part of the art of managing a business is to preserve your options as long as possible. Stuff happens. You can't predict everything. Given options, even at the last minute, you have the flexibility to adapt to problems and changing markets. For example, some companies ship multiple versions of a product, differing only in the code. An OTP part lets them make a last minute decision, on the production floor, about how many of a particular widget to build. If you have a half million dollars tied up in inventory of masked parts, your options are awfully limited.

Part of the 8051's success came from the wide variety of parts available. You could get EPROM or masked versions of the same part. Low volume applications always took advantage of the EPROM version. OTP reduces the costs of the parts significantly, even when only building a handful.

Tools

Microcontrollers pose special challenges for designers. Since a typical part is bounded by nothing more than I/O pins it's hard to see what's going on inside. Nohau, Metalink, and others have made a great living producing tools designed specifically to peer inside of these devices, giving the user a sort of window into his usually-closed system.

Now, though, as the price of controllers slides towards zero and the devices are hence used in truly minimal applications, I hear more and more from people who get by without tools of any sort. While it's hard to condone shortchanging your efficiency to save a few dollars, it's equally hard to argue that a 50 line program needs much help. You can probably eyeball it to perfection on the first or second iteration. Again, appropriate technology is the watchword. 5000 lines of assembly language on a 6805 will force you to buy decent debuggers... and hopefully a C compiler.

You can often bring up a microcontroller-based design without a logic analyzer, since there's no bus to watch. Some people even replace the scope with nothing more than a logic probe.

An army of tool vendors supplies very low cost solutions to deal with the particular problems posed by microcontrollers. You have options, lots of them, when using any reasonable controller: far more than if you decide to embed a SPARC into your system.

Some companies cater especially to the low-end. Most do a great job, despite the low cost. I recently looked at Byte Craft's array of compilers for microcontrollers from Microchip, Motorola, and National. Despite the limited address spaces of some of these parts, it's clear a decent C compiler can produce very efficient code.

Working in a shop using mostly mid-range processors, I'm amazed at the amount of fancy equipment we rely on, and am sometimes a bit wistful for those days of operating out of a garage with not much more than a soldering iron, logic probe, and a thinking cap. Clearly, the vibrant action in the controller market means that even small, under- or un-capitalized businesses still can come out with competitive products.