For novel ideas about building embedded systems (both hardware and firmware), join the 40,000+ engineers who subscribe to The Embedded Muse, a free biweekly newsletter. The Muse has no hype and no vendor PR.
By Jack Ganssle
December's issue of MIT's Technology Review (http://www.technologyreview.com/purchase/pdf_dl.asp?79juh=69012&hy6f0=20217, see the picture on page 55) has a dramatic full-color diagram of the Insulin molecule, an exquisitely complex amalgamation of 843 atoms. Though the simplest of all proteins, it yielded its secrets only slowly, in part through the power of electron microscopy.
I was struck by biotech's ability to image and create two- and three-dimensional models of the microscopic world, a world far beyond our human senses. Scientists developed tools that morph a molecule's structure into these strangely beautiful and (to a chemist, I guess) very descriptive views. The ability to see, rotate and manipulate the models leads to insight and understanding.
We in the firmware world also manipulate tiny, unseen objects. Insulin's 843 atoms make it a giant of gargantuan proportions compared to the mere handfuls of electrons that form the bits we laboriously... what? Create? Not really; the electrons flow out of the wall socket. Move? No, transistors gate the flow of current. Maybe "organize" is the operative word. But even "organize" implies a process of sorting, which implies there's something being rearranged. One might argue that a program actually exists as charges embodied in Flash memory, or magnetized areas on a disk, or a stream of bits pouring over a 'net connection. Yet none of those are essential elements of software; it's a fungible non-commodity perhaps best described by the not-terribly-enlightening word "entropy."
Our job is to reduce entropy, to struggle against the Second Law of Thermodynamics. Put that way it sounds pretty noble!
So we're building, or organizing, things of enormous complexity that don't really exist. And we're doing that with extraordinarily crude tools. Microbiologists can look at a picture of Insulin and even build a 3-D representation. They can see exactly how a ketone, for instance, might bond to the molecule.
But we're single-stepping through source code, perceiving only 5 lines at a time of a mega-LOC monster.
Architects build scale models of their proposed buildings so the customer can see exactly what he's getting, and the designer can visualize the structure in something close to its ultimate reality. Civil engineers do the same; they'd never plop a foot-thick structural analysis on the mayor's desk. Aeronautical engineers use 3-D representations on the computer screen to find parts that interfere with each other.
Where visualization techniques don't exist, the problem is usually poorly understood. Neurologists' MRIs and PET scans produce only macroscopic, blotchy patches depicting blood flow. They can zoom in on individual synapses, axons and dendrites with an electron microscope, just as we can completely understand a gate. But such Cartesian reductionism hasn't taught them much about how the brain functions. They're still pretty clueless.
Code, of course, is generally incomprehensible and our attempts at abstraction mostly pathetic. Flow-charts, data flow diagrams and even UML drawings are not pictures of a program; they're analogies, different ways of expressing the ideas embodied in the software. But analogy is indeed all we have.
I've tacked the Insulin picture over my desk to remind me of the poor state-of-the-art of software development, and continue to search for other analogies that can more graphically illustrate the meaning of our code.
"Code" is indeed the correct word to describe our creations, for we take a crude specification and transcribe it into a computer program using a ciphering system not unlike that of the German Enigma machine. "for (i=0; i<end; i++);" means nothing to our customers - it's a code that does somehow instruct the machine to do something. But there's no direct correspondence between even those cryptic phrases and bits in the computer - a compiler surely scrambles and optimizes the for loop into dozens of instructions scattered nearly at random over perhaps thousands of bytes of memory. Once smart developers could understand machine code, but now, with RISC machines and highly-optimizing compilers, that's possible only with extreme effort.
The three actors in our drama - customer, programmer and machine - all speak different languages poorly bridged by inadequate tools.
The future surely must hold a different analogy for software development. Our brains have huge vision centers, which is why we understand best via graphics and visualization techniques. But those depictions have to have a ring of the familiar, as does the image of Insulin. We need a different way to visualize computer programs, something that's visceral and dramatic. Something that's a far cry from a 50-pound listing.
Someday we'll have a virtual reality environment that puts programmers into the center of the software, to illustrate data movement and real-time events. There will be a zoom control that lets customers back off from details yet clearly see the features. Managers will get a different view entirely, one that shows details of development progress, quality and costs both incurred and projected.
After all, a picture is worth a thousand functions.
What do you think? Is the future merely more dreary source level debugging and heroic translations of inadequate specs into C?