
By Jack Ganssle


In 1967 Keuffel & Esser (the greatest of the slide rule companies) commissioned a study of the future. They predicted that by 2067 we'd see three-dimensional TVs and cities covered by majestic domes. The study somehow missed the demise of the slide rule (their main product) within 5 years.

Our need to compute, to routinely deal with numbers, led to the invention of dozens of clever tools, from the abacus to logarithm tables to the slide rule. All worked in concert with the user's brain, in an iterative, back and forth process that only slowly produced answers.

Now even grade school children routinely use graphing calculators. The device assumes the entire job of computation and sometimes even data analysis. What a marvel of engineering! Powered by nothing more than a stream of photons, pocket-sized, and costing virtually nothing, our electronic creations give us astonishing new capabilities.

Those of us who spend our working lives parked in front of a computer have even more powerful computational tools. The spreadsheet is a multi-dimensional version of the hand calculator, manipulating thousands of formulas and numbers with a single keystroke. Excel is one of my favorite engineering tools. It lets me model weird systems without writing a line of code, and tune the model almost graphically. Computational tools have evolved to the point where we no longer struggle with numbers; instead, we ask complex "what-if" questions.

Network computing lets us share data. We pass spreadsheets and documents among co-workers with reckless abandon. In my experience widely-shared big spreadsheets are usually incorrect. Someone injects a row or column, forgetting to adjust a summation or other formula. The data at the end is so complex, based on so many intermediate steps, that it's hard to see if it's right or wrong... so we assume it's right. This is the dark side of a spreadsheet: no other tool can make so many incorrect calculations as fast.

Mechanical engineers now use finite element analysis to predict the behavior of complex structures under various stresses. The computer models a spacecraft vibrating as it is boosted to orbit, giving the designers insight into its strength without running expensive tests on shakers. Yet finite element analysis is so complex, with millions of interrelated calculations! How do they convince themselves that a subtle error isn't lurking in the model? Like subtle errors lurking hidden in large spreadsheets, the complexity of the calculations removes the element of "feel". Is that complex carbon-fiber structure strong enough when excited at 20 Hz? Only the computer knows for sure.

The modern history of engineering is one of increasing abstraction from the problem at hand. The C language insulates us from the tedium of assembly, which itself removes us from machine code. Digital ICs protect us from the very real analog behavior of each of the millions of transistors encapsulated in the chip. When we embed an operating system into a product we're given a wealth of services we can use without really understanding the how and why of their operation.

I truly believe that increasing abstraction is both inevitable and necessary. An example is the move to object oriented programming, and more importantly, software reuse, which will - someday - lead to "software ICs" whose operation is as mysterious as today's giant LSI devices, yet that elegantly and cheaply solve some problem.

A friend who earns his keep as a consultant sometimes has to admit that a proposed solution looks good on paper, but just does not feel right. Somehow we synthesize our experience into an emotional reaction as powerful and immediate as any other feeling. I've learned to trust that initial impression, and to use that bit of nausea as a warning that something is not quite right. The ground plane on that PCB just doesn't look heavy enough. The capacitors seem a long way from the chips. That sure seems like a long cable for those fast signals. Gee, there's a lot of ringing on that node.

Practical experience has always been an engineer's stock-in-trade. We learn from our successes and our failures. This is nothing new. According to "Cathedral, Forge and Waterwheel" (Frances and Joseph Gies, 1994, HarperCollins, NY), in the Middle Ages "Engineers had some command of geometry and arithmetic. What they lacked was engineering theory, in place of which they employed their own experience, that of their colleagues, and rule of thumb."

(Our well-known ego is nothing new; another quote from the same book alludes to "the hubris of such men"!)

The flip side of a "feel" for a problem is an ability to combine that feeling with basic arithmetic skills to very quickly create a first approximation to a solution, something often called "guesstimating". This wonderful word combines "guess" - based on our engineering feel for a problem - and "estimate" - a partial analytical solution.

Guesstimates are what keep us honest. "200,000 bits per second seems kind of fast for an 8 bit micro to process" (this is the guess part) "why, that's 1/200,000 or 5 microseconds per bit" (the estimate part). Maybe there's a compelling reason why this guesstimate is incorrect, but it flags an area that needs study.
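The arithmetic behind that guesstimate is trivial enough to sketch out (a hypothetical illustration; the numbers come from the example above):

```python
# Sanity check: at 200,000 bits per second, how long does an
# 8-bit micro get to process each bit?
bit_rate = 200_000             # bits per second (the "guess" under scrutiny)
time_per_bit = 1 / bit_rate    # seconds per bit (the "estimate")

print(f"{time_per_bit * 1e6:.0f} microseconds per bit")  # prints "5 microseconds per bit"
```

Five microseconds is only a handful of instructions on a slow 8-bit part, which is exactly why the number deserves a second look.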

In May an Australian woman swam the 110 miles from Havana to Key West in 24 hours. Public Radio reported this information in breathless excitement, while I was left baffled. My guesstimate said this is unlikely. That's a 4.5 MPH average, a pace that's hard to beat even with a brisk walk, yet she maintained this for a solid 24 hours.

Maybe swimmers are speedier than I'd think. Perhaps the Gulf Stream spun off a huge gyre, a rotating current that gave her a remarkable boost in the right direction. I'm left puzzled, as the data fails my guesstimating sense of reasonableness. And so, though our sense of "feel" can and should serve as a measure against which we can evaluate the mounds of data tossed our way each day, it is imperfect at best.

The art of "guesstimating" was once the engineer's most basic tool. Old engineers love to point to the demise of the slide rule as the culprit. "Kids these days", they grumble. Slide rules forced one to estimate the solution to every problem. The slide rule did force us to have an easy familiarity with numbers and with making course but rapid mental calculations.

We forget, though, just how hard we had to work to get anything done! Nothing beats modern technology for number crunching, and I'd never go back. Remember that the slide rule FORCED us to estimate all answers; the calculator merely ALLOWS us to accept any answer as gospel without doing a quick mental check.

We need to grapple with the size of things, every day and in every avenue. A million times a million is, well, 10**12. A gigahertz corresponds to a period of one nanosecond. 4.5 miles per hour seems fast for a swimmer. It's unlikely your interrupt service routine will complete in 2 microseconds.
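Each of those magnitude checks takes one line of arithmetic (a sketch of the checks named above, including the swimmer from earlier):

```python
# Everyday magnitude checks, each a one-liner.
assert 1_000_000 * 1_000_000 == 10**12      # a million times a million
assert 1 / 1e9 == 1e-9                      # 1 GHz -> a period of one nanosecond

# The Havana-to-Key West swim: 110 miles in 24 hours.
speed_mph = 110 / 24
print(f"{speed_mph:.1f} MPH")               # prints "4.6 MPH" - fast for a swimmer
```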


We're building astonishing new products, the simplest of which have hundreds of functions requiring millions of transistors. Without our amazing tools and components, those things that abstract us from the worries of biasing each individual transistor, we'd never be able to get our work done. Though the abstraction distances us from how things work, it enables us to make things work in new and wondrous ways.

The dark side of complexity is our inability to predict how things will behave in the best of circumstances. Steve Talbot's Netfuture electronic newsletter recently had a fascinating discussion of the perils of complexity. He concluded that "technology is brittle". Things become so elaborate no one understands them. Make a change - who knows what will happen? If an expert system has 2000 rules, and you add one more, how will that affect the other 2000 rules?

The art of guesstimating fails when we can't or don't understand the system. Perhaps in the future we'll need computer-aided guesstimating tools, programs that are better than feeble humans at understanding vast interlocked systems. Perhaps this will be a good thing. Maybe, like double-entry bookkeeping, a computerized guesstimator will at least allow a cross-check on our designs.

As a nerdy kid in the 60s various mentors steered me to vacuum tubes long before I ever understood semiconductors. A tube is wonderfully easy to understand. Sometimes you can quite literally see the blue glow of electrons splashing off the plate onto the glass. The warm glow of the filaments, the visible mesh of the control grids, always conjured a crystal clear mental image of what was going on.

A 100,000-gate ASIC is neither warm nor clear. There's no emotional link between its operation and your understanding of it. It's a platonic relationship at best.

Mentor Constantly

So, what's an embedded engineer to do? How can we reestablish this "feel" for our creations, this gut-level understanding of what works and what doesn't?

The first part of learning to guesstimate is to gain an intimate understanding of how things work. We should encourage kids to play with technology and science. Help them get their hands greasy. It matters little if they work on cars, electronics, or in the sciences. Nurture that odd human attribute that couples doing with learning.

The demise of Heathkit's kit business removed one of the most visible of the electronics playgrounds, but others still exist. Check out Nuts & Volts Magazine (800-783-4624), which is an amazing compendium of ads for parts, kits, and (unhappily) what is probably illicit telephone modification gear. It is fascinating, and is like a virtual junk box.

The second part of guesstimation is a quick familiarity with math. Question engineers (and your kids) deeply about things. "Where did that number come from?" "Do you believe it, and why?"

Work on your engineers' understanding of orders of magnitude. It's astonishing how hard some people work to convert frequency to period, yet this is the most common calculation we do in computer design. If you know that a microsecond corresponds to a megahertz, and a millisecond to a kilohertz, you'll never spend more than a second getting a first-approximation conversion.
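The conversion is a single reciprocal in either direction (a minimal sketch; the function names are my own):

```python
def period(frequency_hz: float) -> float:
    """Seconds per cycle for a given frequency in hertz."""
    return 1.0 / frequency_hz

def frequency(period_s: float) -> float:
    """Frequency in hertz for a given period in seconds."""
    return 1.0 / period_s

print(period(1e6))       # 1 MHz -> 1e-06, i.e. one microsecond
print(frequency(1e-3))   # one millisecond -> 1000.0 Hz, i.e. one kilohertz
```

The point is not the code but the reflex: seeing "1 MHz" and instantly thinking "one microsecond" without reaching for a calculator.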

The third ingredient is to constantly question everything. As the bumper sticker says, "question authority". As soon as the local expert backs up his opinion with numbers, run a quick mental check. He's probably wrong.

In "To Engineer is Human" (1982, Random House, NY), author Henry Petroski says "magnitudes come from a feel for the problem, and do not come automatically from machines or calculating contrivances". Well put, and food for thought for all of us.