By Jack Ganssle
My son starts college this fall, which is fortunate, as I just can't answer his questions about physics anymore. Like most incoming freshmen he'll "need" a new computer. "Need" is of course a relative thing, but I suppose few students use pencils and slide rules anymore.
So I found myself at a CompUSA trying to buy a laptop. Service, in this service economy, was of course abysmal. An hour went by as my blood pressure rose. The CompUSA associates knew little and cared less. I left in a huff, annoyed at spending an entire hour unsuccessfully trying to buy one lousy computer.
Later I realized how much has changed in a single generation. An hour to buy a computer? When I started college the microprocessor hadn't been invented. Though companies could afford minicomputers, individuals never owned a machine. The University had a Univac 1108, a $10 million system sporting a 1.25 MHz clock with a meg of memory. 40,000 students competed for time on this machine. Typically it ran - poorly - some 500 jobs at a time, mostly submitted on card decks. The system crashed more often than Windows 1.0 and regularly lost jobs.
And I'm complaining about an hour to buy a computer whose power possibly outstrips all of the computers in the world back in 1971?
Back in those olden days even the smallest mini cost tens of thousands of dollars. And that's when a new VW ran $3k or so. Yet now a few hundred dollars gets one an entry-level desktop.
Even handheld calculators didn't exist. Belt-worn slide rules slapped engineering students' legs, branding them as geeks long before Geeks On Call became an icon.
In the early 70s HP released their HP-35, a Reverse Polish Notation machine that offered quite a few useful engineering functions - for $400, when $400 was worth probably $1k in 2005 dollars. Yet now calculators are trade-show giveaways. High school kids "need" elaborate graphing calculators in their Algebra 1 classes. And, having watched them work these marvels, I have to admit it's breathtaking to see a function unveil its mysteries without tediously plotting on graph paper. Remember how mind-numbingly boring that was?
Computers long ago became appliances. Devices we buy rather like selecting a toaster or TV. We "need" them for functions no one would have imagined. I'm utterly unable to write with a pen anymore, editing each sentence a dozen times and counting on the machine to highlight misspellings. It's my revenge on the nuns who spent 8 years tormenting me for lousy penmanship. Now, who writes much of anything without booting a word processor? Does penmanship matter anymore? But PDAs still can't recognize my scribbling, so maybe the good sisters were right after all.
An hour to buy a computer? I remember spending weeks coercing a parental friend just to let me in to see one, a mainframe buried in a NASA basement. And spending all of my lawn mowing money for 110 baud access to a Honeywell machine to learn Fortran.
I eventually did get a machine for my son, a shiny Vaio with dazzling graphics that's giving me laptop lust. And he'll have exclusive access to the thing; it won't be time-shared. He expects just that - hasn't it always been this way? One person, one computer? Few remember those old big iron days. Or that in 1974 when MIT retired their 7094, they found that, due to an OS bug, a job that had been submitted in 1967 was still waiting to run.
Now that's a patient user.
Just a generation ago few people really "needed" a computer. Now we can't imagine running our lives without one, or two, or three. And that's not including the blizzard of embedded processors surrounding us. Sure, it might take an hour, or two, or three to buy one.
But in lucid moments I remain astonished that so much capability is available in every shopping mall for so little money.