
By Jack Ganssle

Published in ESD December 2008

A Look Ahead

Welcome to the 40th anniversary issue of Embedded Systems Design "magazine." That last word is an anachronism only oldsters recognize. For three decades ESD produced a magazine, which, in its early years, was printed on "paper" (organic material formed into flat sheets which could be marked with symbols) and was actually moved physically from manufacturing facilities to engineers' desks. At one point the "postal system" used over a quarter million vehicles to move materials, mostly junk mail, to people's homes and offices.

Magazines evolved from this printed form to, briefly, an all-electronic version that exactly mimicked the printed version, to a streaming feed of random thoughts (see "blog" and "RSS") created by anyone with an ax to grind. That incessant babble eventually approached a zero-information state and was replaced by today's holographic VR Embedded Systems Design Space in which most users immerse themselves while their automover drives them home.

How much has changed!

Perhaps one of the biggest differences in embedded developers' work over the last two decades has been the dissolution of the notion of hardware and software as separate entities. When was the last time you actually wrote code or designed a circuit? Graphical modeling tools and FPGAs signaled the inevitable, though few realized it at the time. Bigger, cheaper programmable logic, coupled with problems of unimaginable complexity, meant engineers eventually started designing systems as a whole: blocks that had certain functions and interacted in well-defined ways. Tools translated these concepts into an optimum mix of transistors and code, balancing performance, size, and cost concerns against the hardware/software mix. No one looks at code or transistors anymore; we work entirely at a 3D graphical problem-domain model level.

Once bulky power supplies converted power delivered by a huge continent-wide grid of power-generating stations to levels appropriate for the particular application. Some portable devices did operate off of "batteries" for a short time, but had to be frequently reconnected to the grid to be recharged.

When in 2014 Électricité de France complained that the Large Hadron Collider had stopped both making contractually-required payments for electricity and consuming any of the giant company's power, LHC scientists were forced to admit they had accidentally created both a micro black hole and an anti-hole, and had learned to harness the resulting matter/antimatter reaction to power the lab. Fear of global destruction quickly changed to an explosion of venture funding for MBH (micro black hole) power startups. The worldwide depression brought on by housing's collapse six years earlier changed to a new economic bubble, one we still ride today.

It's hard to believe we once plugged all of our appliances into a wall socket (if you live in an older home you may see these vestigial remnants of yesteryear under the wallsensepaper). Today most devices use a molecular-sized MBH encapsulated in a nano-scale containment torus. Big factories use a pinhead-sized MBH massing a few thousand tons. All contain sophisticated electronics and code to mediate the reaction.

Remember the comically quaint Mr. Fusion from that ancient Back to the Future movie? Doc had to feed it banana skins and cheap beer from time to time to keep it on the move. Today automovers run a lifetime on a single MBH.

Twenty years ago pundits predicted the death of 8-bit systems; the availability of essentially free power hardened those views. But market forces were not to be dismissed. Though 8-bit technology always lagged the bleeding edge, it did follow a parallel path. So today's 8-bitters use otherwise-obsolete 22 nm geometry. As a result the devices are extremely small (invisible, really, once we learned to do away with packaging) and practically cost-free.

The result: entirely new types of applications that were unimagined not long ago. The Johnny Appleseed metaphor of scattering sensors to the wind was surpassed long ago. One of the most innovative results is the control of the Anopheles mosquito, which once caused so much suffering on this planet. Bioengineers created sex pheromones that attract the bugs to a food stash; there a spray field attaches tiny embedded systems to their wings, which identify humans and steer the mosquito away. And of course the smart hemoscrapers circulating in our blood to dissolve plaque have been a boon to the fast food industry.

But what about the future? What will the embedded landscape look like 40 years hence?

That's the wrong question. We've learned that embedded systems and lifestyles evolve in lockstep. One has to look at the two as a gestalt.

I/O devices, for instance, have changed a lot since the ancient days of computers. Punched cards gave way to keyboards, then to virtual keyboards, and then to voice and gesture input. Output even today remains largely a presentation of words and drawings, though every home entertainment system uses total-immersion virtual reality. (We're all familiar with virtfam syndrome: those poor individuals, never able to form an intimate relationship, who create entire virtual families. They come home from work, pop into VR, and spend the entire evening interacting with perfectly human-looking simulacrums, editing out the sickness and death, conflicts and battles that are part of the human condition.)

VR's bandwidth is still limited to the speed at which one can interact with the artificial environment. Recent work in machine-brain coupling promises a direct neuron-stimulating approach that could dump big chunks of information into one's head at fantastic rates. Embedded systems will identify the structural elements of each person's unique brain and form the conduit that channels data directly from machine to cortex. A scientist looking at experimental results will instantly just seem to know what happened. Patients' complete medical histories will pop into the physician's head just before the examination.

A back channel may ultimately be found as well. Think it and your computer acts on it.

When pondering the future two decades ago I thought that by 2028 we'd have merged biotechnology with electronics. Hasn't happened. Though they do work hand-in-hand, the two technologies still remain distinct. In the '00s science hailed the sequencing of the human genome and predicted great advances in the short term. That hope turned sour as we learned that, instead of reading the code, we had assembled what was little more than a partial index into a giant encyclopedia. Facts are not the same as knowledge, and the genetic code is so fiendishly complex it has defied most of our attempts to understand it.

Perhaps our struggles with DNA prove popular interpretations of Gödel's Theorem: how can we understand our own DNA? But advancing technology may make that question moot. Last year's introduction of the Quantum Computing Engine 1 (QCE 1), which has solved every NP-complete problem fed to it, suggests that computer performance will shortly far exceed any human mental capability. We're still unique in our emotions, but it's clear a QCE-derived machine will be able to tackle problems of staggering complexity. Since the machine is currently operating full-time to design its successor, it's reasonable to think that QCE 2 - or 3, or 15 - will be able to puzzle out the functioning of complete DNA molecules. Couple that understanding (well, at least the machine's understanding) to its design ability and I think we'll find biological structures coupled to machines in remarkable ways. By 2068 hemoscrapers will be as archaic as the cell phone. Maybe humanity will look like the Borg. Surely our vision and other sensory apparatus will be improved. Will reasoning be enhanced? Lifetimes extended, perhaps redefined as data moving between biomechanical vessels? I hope philosophers focus on the ontology of being human, and wonder if the next great philosopher will be a QCE.

Money will disappear within the next 40 years due to embedded technology. By that I mean much more than what happened when the US adopted the Euro in 2015, and then abandoned all cash the following year. (One still sees old cash used as insulation, but that's about it.) Money and the financial system will go away because of the robots that today take care of many routine needs in most households. Machine perception is remarkably capable and nearly what's needed for a general servant. Actuators have improved significantly, but remain expensive.

However, the precision machining needed by robotic actuators will cost exactly nothing once we embed enough intelligence into these nearly-thinking machines that they can manufacture themselves. When robots mine and refine materials and then build copies of themselves, the concept of wages will disappear. Need something? A robot, which has no materials or labor cost, will make it for you. What value will money have in such a society? This is a hopeful, yet scary, development that a philosopher QCE will have to ponder. Star Trek, still in syndication, predicted a society where people work only for personal development. I see that as a hopeful change, one which will twist every aspect of society. Consider the effect on the MPAA and RIAA when artists produce music merely for fun!

The Malthusian catastrophe so many predicted from population growth has failed to materialize, but it's mathematically certain that continued growth is unsustainable. Yet the integration of electronics and biology will fuel even more life extension - and hence competition for space and resources - than the miracles of modern medicine have accomplished. Perhaps some force will cause couples to have fewer or no children, but a world without masses of children seems barren indeed. It's hard to imagine humans changing so profoundly that the biological imperative of reproducing and raising youngsters to adulthood would disappear. Perhaps the embedded technology that enables robots will support the need to nurture. Even decades ago some people formed attachments to primitive robots - toys, really - that emulated a few canned responses. Certainly, improving technology implies more realistic, and thus more compelling, pseudo-kids. Will the enduring human yearning for offspring be redirected toward electronic facsimiles as regulatory or other influences reduce procreative opportunities?

Much of this speculation sounds like science fiction, but today's mundane technology was not even dreamed of twenty years ago. From here in 2028 I see wonderful opportunities for the future as well as terrifying challenges. But two hundred years ago newspapers warned readers of the dangers of train travel at 10 MPH. The future always looks a bit scary.

Happily, we embedded engineers will have a hand in creating it.