By Howard Yu
June 18, 2016

Intel has been inside computers ever since the advent of the PC. But for the first time, the company is making a serious push into mobile, as it announced last week that Apple (AAPL) will supply Intel’s modem chips for the next iPhone, displacing Qualcomm (QCOM) in the deal. Yet this is not a cause for celebration, even for Intel. The imminent battle among high-tech firms stems from a deeper industry problem: Computers will soon stop getting any faster.

It is an open secret in the IT world that the microprocessor, the very heart that springs into motion when you turn on your computer, is approaching the fundamental limit of smallness. Because smaller devices are more susceptible to manufacturing errors, ever-higher levels of precision are required. The cost of manufacturing equipment thus spirals upward astronomically, and major players have been spending billions just to keep the game going. But when that stops, that is, when computers stop running faster and faster as they have in the past, all the flurry of product hype, from virtual reality headsets to the Internet of Things to artificial intelligence, may well grind to a halt. Machines, after all, won’t replace humans anytime soon.

In 1965, Intel (INTC) co-founder Gordon Moore made a bold prediction about the exponential growth of computing power. He observed that the number of transistors that fit on a microprocessor could be doubled roughly every two years by shrinking their size. And since transistor density correlates with computing power, computing power correspondingly doubles every two years. Intel has since delivered on that promise, and the observation has been immortalized as Moore’s Law.
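The doubling arithmetic can be sketched in a few lines of Python. The 1971 Intel 4004 baseline of roughly 2,300 transistors is a widely cited figure, not one taken from this article, and the model below is the idealized two-year doubling, not a fit to real chip data:

```python
def transistors(year, base_year=1971, base_count=2300, doubling_period=2):
    """Projected transistor count, assuming a clean doubling every two years."""
    doublings = (year - base_year) / doubling_period
    return base_count * 2 ** doublings

# Each decade adds five doublings, i.e. a 32x jump in transistor count.
for y in (1971, 1991, 2011, 2016):
    print(f"{y}: ~{transistors(y):,.0f} transistors")
```

Run forward 45 years, the toy model lands in the low billions of transistors per chip, which is the right order of magnitude for the flagship processors of 2016.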

Take an imaginary letter-size sheet of paper. Fold it in half, then fold it a second time, and then a third. The thickness of the stack doubles with every fold, growing exponentially. If you were skillful enough to fold the same piece of paper 42 times, you would have a tower that stretches to the moon. That’s why a single iPhone today commands computing power equivalent to that of the entire spacecraft for the Apollo moon mission in 1969. Indeed, without Moore’s Law, there would be no Google (GOOG), no Facebook (FB), no Uber, and no Airbnb. Silicon Valley would be, well, just a valley.
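The folding illustration checks out on the back of an envelope. Assuming a sheet thickness of 0.1 mm (a typical value for office paper, not a figure from the article), 42 doublings comfortably overshoot the roughly 384,400 km to the moon:

```python
thickness_mm = 0.1      # assumed thickness of one sheet of paper
folds = 42
moon_distance_km = 384_400  # average Earth-moon distance

# Each fold doubles the stack: total height is thickness * 2^folds.
stack_km = thickness_mm * 2 ** folds / 1_000_000  # convert mm to km
print(f"After {folds} folds: ~{stack_km:,.0f} km")
```

At 41 folds the stack would still fall short, which is why the number 42 shows up in this classic illustration of exponential growth.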

That half-century of clockwork progress is now rapidly slowing.

Just four months ago, Intel disclosed in a regulatory filing that it is slowing the pace at which it launches new chips. Its latest transistors are down to only about 100 atoms wide, and the fewer atoms that make up a transistor, the harder it is to manipulate. On the existing trajectory, by early 2020 transistors would be just 10 atoms wide. At that scale, electronic properties are disrupted by quantum uncertainty, making any device hopelessly unreliable. In other words, engineers and scientists are hitting the fundamental limits of physics.

Samsung, Intel, and Microsoft (MSFT) have poured $37 billion into keeping the magic going. “From an economic standpoint, Moore’s Law is over,” said Linley Gwennap, who runs an analyst firm in Silicon Valley.

Fortunately, raw computing power is not everything. Consider what happened recently in the auto industry. The Tesla (TSLA) Model S doesn’t go any faster than a Toyota (TM) Lexus, but the two are very different cars, with innovations ranging from electric motors to battery packs and much else. In the IT world, despite 50 years of staggering growth in computing brawn, commensurate development in software has taken a backseat. Charles Simonyi, the computer scientist who oversaw the development of Microsoft Word and Excel, said in 2013 that software had failed to leverage the advances in hardware. It is too tempting to rely on hardware’s brute force to mask inelegant software design.

With the demise of Moore’s Law, the semiconductor industry will no longer have a tangible road map, refreshed every two years, to coordinate its hundreds of computer manufacturers and suppliers. The dismantling of the existing industry order will usher in a new era in which innovation is more nuanced, less structured, and increasingly complicated. Software firms will dabble in hardware; hardware makers will build niche products. Facebook and Amazon (AMZN) are already designing their own data centers, Microsoft has started making its own chips, and Intel is now jumping into mobile.

The 50-year saga of Moore’s Law has not only delivered a world unimaginable a few generations ago; it has also taught us an important lesson about industry dynamics. Competition is less like a boxing match than it appears: the pecking order among players changes little during normal times. But when disruption strikes, as with the onslaught of digitization or the sharing economy, new windows of opportunity open.

The end of Moore’s Law will not be the end of the IT world, but it will demand new ways of building better machines. As for the rest of us, here is the really good news: The advent of superintelligent machines that “could spell the end of the human race,” as physicist Stephen Hawking recently put it, has now been postponed.

Howard Yu is professor of strategic management and innovation at IMD. He specializes in technological innovation, strategic transformation, and change management. In 2015, Professor Yu was featured in Poets & Quants as one of the Best 40 Under 40 Professors. He received his doctoral degree at Harvard Business School.
