
By Jack Ganssle

Hardware vs. Software

Published 10/05/2001

Howard Smith writes:

"I think it is long overdue for the software community to take a hard look at the tools that the chip community is currently using to do the SOC (Systems On a Chip) designs. Whether the design is described in Verilog or VHDL, there are excellent simulation tools to verify and test the design. These tools are designed to work on three different levels: a behavior level, a synthesized gate level, and a synthesized gate level with delay information. The designer needs to create a Test Bench that becomes the test suite. Test Benches can be designed for the entire SOC, or a subsystem within the SOC, or even just a simple function. The same tool can do all of the testing, even the integration testing.

"I think the software community would greatly increase their productivity if they would just step away from the coding exercise, and think about the software design process. Then they may be able to see where a better set of tools would be a great help.

"One other chilling thought for the software community: I am seeing SOC designs that are now including hardware implementations of things like TCP/IP stacks. Maybe the hardware guys will solve the problem. Their tools can do really cool things like state machines! And, they know that it has to be right. First Time!"

Thanks, Howard, for your perspective. My usual response to the "we hardware guys have to get it right" argument is that firmware generally implements incredibly complex functionality. But that response is getting harder to defend as hardware takes on more and more traditional software functions. The Pentium 4 has 45 million transistors, a hugely complex beast by any measure. Yet it works extraordinarily well. A piece of code of comparable size (4 million lines? 40 million?) would typically be rife with problems.

At the risk of oversimplifying, I think Howard argues that hardware folks invest much more time and money in building reliable simulation and test environments than firmware people do. Their motivation is the extreme cost of mistakes: any error means spinning a new piece of silicon.

Contrast that to the firmware world: defects have no obvious cost (we're talking pre-release errors, during the normal debugging cycle). Just keep debugging till the damn thing finally works. Or at least until our tests detect no more mistakes.

I've watched a lot of projects go from inception to release (or to the trashcan). It's extremely common to see firmware testing get shortened as deadline pressures intensify. We defer, delete or destroy tests in an effort to SHIP. That just does not happen when designing chips.

Howard suggests that the chip designer's superior tools are what matter. Perhaps. I wonder if the developers' attitudes are more crucial. The cost of failure looms in every engineer's mind - and seeds his nightmares - when confronted with the "go to silicon or check the tests" decision. That attitude - and, frankly, that fear - drives designers to create near-perfect simulation environments.

That's much less common with any sort of software.

What do you think? Is it meaningless to compare hardware and software development? Or are there lessons we need to learn?