
The Demise of Moore's Law?

Summary: Moore's Law has driven electronics for 50 years, and shows no sign of abating.

Last month (April, 2015) was the 50th anniversary of Moore's Law.

No one seems to be sure what Moore's Law is. Moore originally predicted that the number of transistors on a chip would double every year; later he amended that to every two years. Some now claim the law sets an 18-month rate. Others say it predicts that system performance will double every 18 months (or two years, depending on who is making the claim).

The mainstream press is clueless about the "law." Tom Friedman, of The World is Flat fame, seems to think it really is a law, instead of an observation and an aspiration. Gordon Moore's 1965 paper showed it to be a historical pattern rather than a law of nature. And it's an aspiration as leading semiconductor vendors use it as a guide for their products. For instance, Intel plans for a "tick" (process shrink) or a "tock" (new microarchitecture) every 12-18 months.

I can't find what process geometry was used in 1965 when Moore made his prediction, but remember that few ICs had more than dozens of transistors at the time. I thought the densest part of that era might have been the 74181 ALU, which implemented about 75 gates, but it appears (http://ygg-it.tripod.com/id1.html) that this part didn't come out until five years later. Given that we've gone from a handful of transistors per IC to billions, Moore's prediction was certainly remarkable and prescient.
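To put that growth in numbers, here's a quick sketch of the simple doubling model. The ~50-transistor starting point for 1965 is my illustrative assumption (roughly "dozens of transistors"), not a figure from Moore's paper:

```python
# Rough sanity check on Moore's Law growth: assume (illustratively)
# ~50 transistors in 1965 and a doubling every two years.
def transistors(year, start_year=1965, start_count=50, doubling_years=2):
    """Projected transistor count under a simple doubling model."""
    doublings = (year - start_year) / doubling_years
    return start_count * 2 ** doublings

# 50 years -> 25 doublings -> ~1.7 billion, consistent with
# "a handful of transistors to billions."
print(f"{transistors(2015):,.0f}")
```

Twenty-five doublings is a factor of about 33 million, which is why even a modest starting count lands in the billions.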

Is it coming to an end? For 30 years naysayers have been claiming that process scaling is doomed, but the incredible engineers designing these parts continue to astound.

Gordon Moore thinks the law will peter out in the next decade or so (http://spectrum.ieee.org/computing/hardware/gordon-moore-the-man-whose-name-means-progress). Intel thinks it's good for at least another decade (http://www.v3.co.uk/v3-uk/news/2403113/intel-predicts-moores-law-to-last-another-10-years). Yet in 2003 Intel predicted (http://news.cnet.com/2100-1008-5112061.html) an end to the law shortly after hitting the anticipated 16 nm node. Today it seems we're on track for 10 nm fairly soon, with 7 nm not far off. (There is some marketing fuzz in the definition of "node," and it has been a long time since that term had much to do with the size of a transistor.)

In a sense Moore's Law, or at least many of the benefits it brought, ended years ago, around the 90 nm node, when Dennard scaling fell apart. In 1974 Robert Dennard noted that as geometries shrink we get all sorts of goodness, like higher clock rates and lower power. Many of the benefits he described no longer come with increasing densities. Today the law still gives us more transistors per unit area, which translates into better performance and a lower cost per gate. However, some believe (http://www.edn.com/electronics-blogs/looking---electronics/4376763/The--Scale-of-Things--Nanometers-versus--Giga-Bucks-----Tera-Bucks-?cid=EDNToday) that 20 nm broke even the cost model.

A silicon atom is about 0.25 nm in diameter. Gate oxide thicknesses are measured in a handful of atoms. The scales are truly amazing, and to think of us manipulating things on an atomic scale is breathtaking. The technology is astounding. And yet it's commonplace; most everyone in the developed world has a device built with 28 nm or smaller processes.
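A quick back-of-envelope on that "handful of atoms" claim. The ~1.2 nm oxide thickness here is an assumed figure (often cited for 90 nm-era processes), not from the text:

```python
# How many atoms thick is a gate oxide? Illustrative numbers: the
# article's ~0.25 nm silicon atom diameter and an assumed ~1.2 nm
# oxide thickness (a figure often cited for 90 nm-era processes).
atom_nm = 0.25
oxide_nm = 1.2

print(round(oxide_nm / atom_nm, 1))  # -> 4.8, i.e. about five atoms thick
```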

How many transistors are in your pocket?

A few months ago I bought a 128 GB thumb drive for $45. 128 GB means a trillion bits of information. Who could resist tearing something like that apart? It has two memory chips, so each chip stores a half-trillion bits. Memory vendors don't say much about their technology, but I haven't seen flash storing more than three bits per cell. Each of these chips may therefore have over 100 billion transistors, and that assumes there's no logic for row/column addressing or other overhead. Samsung touts its V-NAND devices, which go 3D, but it's frustratingly hard to get deep technical information about those parts. Xilinx went 2.5D with its interposer arrangement for FPGAs a few years ago.
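The arithmetic behind those figures can be made explicit. This sketch assumes three bits per cell, one cell per transistor, and no addressing or ECC overhead:

```python
# Back-of-envelope for the 128 GB thumb drive: two flash chips, three
# bits per cell, one cell per transistor, no addressing/ECC overhead.
capacity_bits = 128e9 * 8          # 128 GB -> ~1.02 trillion bits
bits_per_chip = capacity_bits / 2  # two memory chips on the drive
bits_per_cell = 3                  # three bits stored per cell
cells_per_chip = bits_per_chip / bits_per_cell

print(f"{cells_per_chip:.0f}")  # ~1.7e11: over 100 billion cells per chip
```

The real transistor count is higher still once row/column decoders, charge pumps, and controller logic are added in.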

I suspect Moore's Law will have a healthy future. Memory devices are leading the way into 3D architectures. Processors might follow, though it's hard to imagine how the somewhat random interconnects CPUs require between layers will work. Interposer approaches will become more common for high-end (read "very expensive") devices, but I doubt that will scale into consumer-style parts.

Regardless, a half-century of exponential improvement in a technology is unprecedented in human history. My hat is off to all of the device engineers who have made this happen.

Published May 6, 2015