In Praise of Kids These Days

Old farts don't always have it right.

Published in ESP, 2000


By Jack Ganssle


 A series of recent columns (December's "A Look Back", January's "Cores, Cards and Tubes" and February's "Reality Bites") brought an enormous response from readers. More than a few emails expressed concern that today's young engineers work at such high levels of abstraction that they don't understand a system's low level details. Reasoning such as "when you sling a million transistors at a time, you never learn the art of electronics" pervaded the responses. My immediate confirmed-old-fart response was to agree, wholeheartedly, with passion.

 But maybe I'm wrong.

 Sometimes the hardest part of getting older is learning to adapt to new ways. Even tougher is admitting that the new ways might indeed be better than those with which we're so comfortable.

 For example, the US markets have grown at an amazing rate for almost a decade. Some pundits gleefully predict that new business models have recession-proofed the economy. After weathering a series of boom and bust cycles over my career I find myself disagreeing. But maybe my view of this is too much tempered by personal experience? Is it possible that things have indeed changed?

 Take the .com companies, firms that offend every business sense I have. Don't these folks have to make a profit sooner or later? But maybe my model of the business world is flawed. Fact is, Netscape built a business by giving products away for free. Perhaps a sea change has truly shifted the way business works. When investors give a $20 billion company (Microsoft) a half-trillion dollar valuation, surely this is a sign that old business conventions are, at least, under attack.

 What about the open source movement? Here's a freebie approach that even half-trillion dollar outfits seem to fear. Old fart me is left dazed and confused.

 And so here's a sort of a parable, a story from the olden days, with an attempt to draw some meaning in terms of how things have changed.

 It was tough being a teenaged geek. While the rest of my school chums pursued football and fast cars, dreams of electronics filled my brain and occupied too many waking hours. Girls, sure, they were important so long as they didn't interfere with the important stuff. 

(Now, of course, there are indeed girl-geeks as well as boy-geeks. The younger engineers I meet, though, generally lead much more balanced lives).

 Why were computers so fascinating in those nascently pubescent years? What compelled me, and many of my likewise geek-friends, to learn everything we could about these inanimate beasts? Dreams, surely, since none of us had even the slightest exposure to real computers, other than to watch their whirling tape drives spin in science fiction movies.  It wasn't till junior year in high school that I first actually used a computer. A friend's school had a timesharing account on a Honeywell mainframe, a machine we never saw, located miles away and fed by dozens of phone lines. We used an ASR-33 teletype in the classroom, connected by a 110 baud acoustic modem to the remote machine.

 For $4/hour we could program to our hearts' content. Of course, at the time I was making $1.60/hour as the lowest of engineering technicians in a space company doing Apollo subcontracts. The company's computer needs were completely serviced by an IBM 1132 rented by the hour, located off-premises, to which only one VP had access. Once, noticing my interest, he took me along to see the machine, a near-religious experience for me.

 (Thank God the computer priesthood disappeared; computers are now an equalizing force, one that has even been partially credited with the collapse of the Soviet empire).

 The $4/hour charge was so much money I spent an enormous number of hours off-line, checking and rechecking my code before running the paper tape into the machine. But that teletype introduced me to the cryptic world of command line compilers, to one of what eventually became many odd dialects of Fortran, and above all to the discipline of logical thinking. I learned what literal creatures these computers were, demanding above all completeness and correctness. Back then the "do what I mean, not what I wrote" compiler just did not exist.

 (For thirty years we've heard predictions of meta-languages that will eliminate programming. It seems that a program is the smallest complete expression of what an application should do; programming is in fact an exercise in specifying the system, and as such will probably never go away).

 I wanted more computer time, but couldn't afford to keep pouring $4/hour into the school's coffers. Clearly I needed my own machine. So started a succession of designs. Never having heard of assembly language I supposed the machine somehow ran Fortran directly, so I created a very simple subset that my crude logic circuits could execute. As soon as one design was done I'd have a better idea and start on the next version. One was bit-serial, with few parallel paths, to minimize hardware. Others were more conventional.

 I actually started to build one or two of these designs. Before becoming a tech I had been a janitor at the same company. Those fat Apollo years led to a rule that we janitors (all unpaid young sons of company engineers) could keep all of the components found on the lab's floor, so my stockpile overflowed with resistors and RTL (resistor-transistor-logic, a prelude to TTL) ICs. The engineers liked to build 3D circuit prototypes, parts sprouting from a vectorboard up towards the sky. They'd junk these creations after working out the bugs; we'd snap them up and unsolder the components, sort them into our own personal bins at home, and then use them as the basis for our own experiments, science fair projects, and eventually computers.

 But for me the ideas flowed faster than construction speed, so I trashed each project as the next new concept improved on its predecessor.

 (Then as now, building things out of hardware was difficult. But with IP cores, VHDL, and huge programmable logic arrays much of hardware construction parallels writing code as we edit and compile abstract symbols rather than solder gates together).

 With college came an account on the university's various machines and an obsession that permitted no time for class or other extraneous events. I discovered assembly language and operating systems. With as much computer time as I needed at the college (my accounts being, ah, "enhanced" by gaining a rather deeper knowledge of the OS's flaws), personal computer lust made little sense. But the seven deadly sins all defy logic, so my quest to obtain a machine continued to rage.

 An article in Popular Electronics (as I recall) described a hobbyist who had acquired a surplus mainframe, somehow figuring out how to rewire massive cable bundles that had been cut with an ax as the previous owner removed the machine. His biggest expense was air conditioning. Fortuitously, just then the university announced they were getting rid of their 7094, which covered several thousand square feet of floor space. With dreams of mainframe ownership I submitted a bid (zero dollars but a good home), though, since I lived in a dorm room, I had no idea where the thing could go. They wisely donated it to a smaller college. Drats, foiled again!

Around this time I met an engineer who had managed to purchase, for an astronomical amount of money, a PDP-8 minicomputer with a single tape drive. DEC made thousands of these machines in many different configurations. My friend had dreams of building a small timesharing business around the computer, but the dreams turned to dust as within a few years timesharing became as obsolete as vacuum tubes. Again, though, I saw that personal computer ownership was a possible dream.

 No one seemed ready to donate a computer, and the costs of buying one were simply beyond consideration, particularly when working many hours a week just to pay tuition. The obvious solution was to settle on a design and actually build a machine - any machine.

 Over the years my parts collection grew past the RTL stage to quite a wide range of TTL devices. So design number 8, the last of these efforts, was based on a TTL CPU. God, I wanted a 16 bit machine, but that required too many parts, too much money, and far too much construction time. 8 bits seemed rather primitive (funny, since now almost 30 years later 8 bits is such a huge part of the market), so I settled on a 12 bit architecture.

 Now that I understood machine language, program counters, and instruction registers, the design proceeded quickly and reasonably elegantly. My overriding concern was to keep costs down, leading to a rather simple instruction set.

 For those of you who remember the TTL days, I used three 74181 ALU chips (now obsolete, though still available in their LS guise) as the central computational elements. These devices took in two four bit arguments and did one of 16 operations (add, subtract, logical AND, OR, and so on) on them, yielding a four bit result with carry. Four bits of my instruction word went directly into the 74181's control inputs, letting the computer do any math operation supported by these parts.
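
 In today's terms the arrangement might look something like the following C++ sketch: a four-bit field lifted straight from the instruction word selects the operation, and cascading slices widens the datapath. The select codes in the sketch are invented for illustration; they are not the 74181's actual function table.

    #include <cstdint>
    #include <cstdio>

    // One 4-bit ALU slice driven by the instruction's function-select field.
    // The codes below are illustrative only; the real 74181 implements 16
    // logic and 16 arithmetic functions chosen by S0-S3 plus a mode pin.
    struct AluResult {
        uint8_t f;      // 4-bit result
        bool    carry;  // carry out of the slice
    };

    AluResult alu_slice(uint8_t a, uint8_t b, uint8_t select, bool carry_in) {
        a &= 0x0F;
        b &= 0x0F;
        uint16_t sum;
        switch (select & 0x0F) {   // four bits straight from the instruction word
            case 0x0: sum = a + b + (carry_in ? 1 : 0);           break; // add with carry
            case 0x1: sum = a + (~b & 0x0F) + (carry_in ? 1 : 0); break; // subtract via A + ~B + carry
            case 0x2: sum = a & b;                                break; // logical AND
            case 0x3: sum = a | b;                                break; // logical OR
            default:  sum = a;                                    break; // pass A through
        }
        return { static_cast<uint8_t>(sum & 0x0F), (sum & 0x10) != 0 };
    }

    int main() {
        // Three slices cascaded, carry rippling between them, make a 12-bit datapath.
        AluResult r = alu_slice(0x9, 0x7, 0x0, false);
        std::printf("F=%X carry=%d\n", r.f, r.carry);  // prints F=0 carry=1 (9 + 7 = 16)
        return 0;
    }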

 By now it was the very early 70s and Intel had started to make their mark in the memory world with the invention of the DRAM and the manufacture of various SRAMs. A surplus shop sold out-of-tolerance SRAMs cheaply. Thirty-six chips gave me 768 words of memory.

 The girlfriend had unsurprisingly dumped me by now, leaving plenty of time for wiring. Circuit boards were too expensive, as were sockets for the ICs, leading to an implementation of several hundred TTL parts placed on about an acre of vectorboards, with wires (telephone wire from a source one didn't ask too many questions of) soldered to each pin.

 The front panel had an array of toggle switches and LEDs that monitored and controlled the instruction register and a few other critical components. At first I had no real I/O device so would enter and run programs in binary on the switches.

 By now I was sharing a corner of an apartment with some others, but the computer work required so much of my space allotment that for the next two years I moved onto the porch.

 Debugging the completed machine took months of part time work. Not owning a scope, I used a voltmeter as my main tool - not a problem, since the machine was fully static. Running the clock at under one hertz, I could track circuit operation quite easily without fancy gear.

 At some point the machine worked! But it was useless without a terminal. Months of scrounging turned up an old Model 15 Teletype, a "weather machine" that incorporated lots of strange weather map symbols as well as normal letters and numbers. These beasts spoke Baudot, a five bit code, instead of ASCII.

 The Teletype weighed several hundred pounds and made a thunderous racket as its half-horse motor spun, sequencing a baffling array of levers and cams. But how could the computer communicate with the Teletype's serial data stream? I eventually wrote a program that "bit banged", transmitting and receiving data by precisely sequencing streams of bits between machines.
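
 In modern terms the transmit side of bit-banging might be sketched like this: a spacing start bit, five data bits sent least-significant-bit first, then a marking stop bit, each held on the line for one timed bit period. The pin and delay routines below are placeholders invented for the illustration - the original code ran on the homemade 12-bit machine, not on anything resembling this.

    #include <cstdint>
    #include <cstdio>

    // The pin and timing routines are stand-ins: these stubs just print the
    // line state so the bit sequence is visible. A real port would toggle an
    // output pin and busy-wait one bit period (roughly 22 ms at the Model 15's
    // 45.45 baud).
    static void set_tx_line(bool level) { std::printf("%d", level ? 1 : 0); }
    static void wait_one_bit_time()     { /* timed busy-wait would go here */ }

    static void send_baudot(uint8_t code) {
        set_tx_line(false);                  // start bit (spacing)
        wait_one_bit_time();
        for (int i = 0; i < 5; ++i) {        // five data bits, least significant bit first
            set_tx_line((code >> i) & 1);
            wait_one_bit_time();
        }
        set_tx_line(true);                   // stop bit (marking); the Model 15 holds it about 1.5 bit times
        wait_one_bit_time();
        std::printf("\n");
    }

    int main() {
        send_baudot(0x03);                   // 00011 is 'A' in the Baudot letters case
        return 0;
    }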

 All of this code was entered on the switches, a word at a time. Happily Intel invented the 1702 EPROM about this time. The lab where I worked allowed me to borrow two EPROMs, giving me 256 words of non-volatile storage, enough to hold a very primitive command line OS that accepted programs from the Teletype's paper tape reader, and that interacted with me, the programmer.

 So, what did I do with this finally completed computer? Nothing. Nada. Zip. In an old Star Trek episode Spock, finally defeated in his quest for a Vulcan bride, tells the winner that "wanting is often much better than having. It's not logical but it's very often true."

 The 8008 also appeared about this time, more or less making my creation a dinosaur. Life got busier as work demands increased till the computer became a dust collector. Its last home was a dumpster.

 So looking back on this frenzy of design and construction I wonder how it benefited me? Sure, there's no better way to learn everything about the internals of a computer. The process helped me master every nuance of decoding and executing instructions. But so what?

 Listen to the technologies in the story of my homemade computer: Teletypes (obsolete), bit-banging code (if not obsolete, it should be), 1702s and 74181s (both the parts and the PMOS/straight-TTL technologies long gone), tape drives (history except in much more sophisticated guises), timesharing (gone), Fortran (dead except in legacy code) and Apollo (sadly, gone, along with so many hopes and dreams). Expertise acquired in these dead arenas is today useless unless translated to today's technologies.

Very few people design computers now. We embedded people buy pre-fabbed CPUs and hang pre-fabbed peripherals around them. Instruction register? Who cares how that works!

 Engineering is the art of problem solving. We use toolkits appropriate to the technology and the times. No one builds processors out of discrete ICs simply because that approach is inferior to off-the-shelf LSI parts. Those of us with the older skill sets may have great insights into many problems. But too many of us lack the skills of today, like building systems with IP cores, C++, and even Java.

 In the late 60s K&E, the slide rule people, predicted domed cities and other amazing changes in just a few decades to come. They missed the fact that in just a few years their business would disappear, overtaken by the cheap pocket calculator.

 We think of symphonic music as something that undergoes no change, that's trapped in the 18th century. But the fact is that Mozart practically invented the piano concerto, Beethoven shocked the world by writing the first choral symphony, and Charles Ives pioneered atonality. Even classical music evolves. Beethoven would have been an unknown if he composed in the style of Bach.

 The one pattern I've observed about embedded systems is the constant drive towards higher levels of abstraction. Moore's Law is merely a subset of this observation; higher transistor counts let us build ever more complex, abstract systems. But in parallel, tools evolve and application needs increase. Every technological improvement distances us from low level details - a good thing, really, since this is the only way to build bigger and more capable systems.

 An obvious example is C++, which brings with it large overheads and ever more distance from the hardware as we abstract more of the system into encapsulated objects that only the original designers understand. Like software ICs these objects simply perform a function; we don't need to look under the hood to use them.
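
 A toy sketch of the idea: the caller of this little queue "object" sees only push() and pop(), while the ring buffer inside stays hidden and could be replaced without touching any calling code. The class and all its names are invented purely for illustration.

    #include <cstdint>
    #include <cstdio>

    // The caller sees only push() and pop(); the fixed-size ring buffer inside
    // is a hidden detail that could be swapped for something else without any
    // calling code changing.
    class ByteQueue {
    public:
        bool push(uint8_t b) {                // returns false when the queue is full
            if (count_ == kSize) return false;
            buf_[(head_ + count_) % kSize] = b;
            ++count_;
            return true;
        }
        bool pop(uint8_t &out) {              // returns false when the queue is empty
            if (count_ == 0) return false;
            out = buf_[head_];
            head_ = (head_ + 1) % kSize;
            --count_;
            return true;
        }
    private:
        static const unsigned kSize = 64;
        uint8_t buf_[kSize];
        unsigned head_ = 0;
        unsigned count_ = 0;
    };

    int main() {
        ByteQueue q;
        q.push('h');
        q.push('i');
        for (uint8_t b; q.pop(b); ) std::printf("%c", b);
        std::printf("\n");
        return 0;
    }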

 And so, when we middle-aged old-farts complain that younger engineers have missed so much by not knowing how an archaic bit of electronics or code works we're attempting, in a way, to preserve a status quo that was never quite so static. The engineers of tomorrow will be tremendously adept at making very complex systems work using tools we cannot now imagine, and that we'll probably never really master.

 Nothing is static; stasis is death. My hat's off to those who master the very latest, the most abstract forms of engineering. Therein lies the future.