IBM aims to replace silicon transistors with carbon nanotubes to keep up with Moore’s Law
IBM has developed a technique that could help the semiconductor industry continue to make ever-denser chips that are both faster and more power efficient. Strap on your science hats while I explain what IBM has accomplished, and why it matters, because this is a significant step in keeping the information technology industry humming along.
IBM (IBM) researchers have figured out how to move electrons on a carbon nanotube, a structure that is 10,000 times smaller than a human hair and an excellent conductor of electricity. IBM’s breakthrough, announced Thursday, is a way to atomically bond a specific type of metal to a carbon nanotube, creating the incredibly tiny contact point needed to move electrons through the nanotube without degrading the chip’s performance. This is a crucial step that should one day let researchers replace silicon transistors with carbon nanotubes.
The research means that chip manufacturers may be able to make chips at a 3-nanometer process node, where transistors sit that close together. The most advanced chips today are made at 14- and 11-nanometer nodes, and pushing transistors any closer is becoming tough to contemplate. Earlier this year IBM announced with much fanfare that it had made a chip at a 7-nanometer node.
The foundation of our information economy is built on one simple fact. Our computing and storage have gotten cheaper roughly every two years thanks to the semiconductor manufacturing industry’s ability to cram more transistors onto a single chip. By virtue of this trend, known as Moore’s Law, our computers have gotten faster and the amount of stuff we can store on them has increased exponentially.
It’s why our smartphones are more powerful than a supercomputer from 1985 and why we can buy a terabyte of storage for about $50 today. But this progress threatens to grind to a halt because we are reaching the physical limits of manufacturing these ultra-dense semiconductors. As we force the transistors closer together, they tend to start leaking electrons. Solving the leakage problem requires expensive and complicated new manufacturing techniques on which the industry spends billions.
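To see why that exponential trend matters so much, here is a back-of-the-envelope sketch of Moore’s Law as “density doubles every two years.” The function name and the round numbers are my own illustration, not figures from IBM or Intel:

```python
# Illustrative only: Moore's Law treated as "transistor density doubles
# every two years." The doubling period is the classic rule of thumb,
# not an exact industry figure.

def moores_law_growth(years: float, doubling_period: float = 2.0) -> float:
    """Return the multiplicative growth factor after `years` years."""
    return 2 ** (years / doubling_period)

# From 1985 to 2015 is 30 years, i.e. about 15 doublings.
factor = moores_law_growth(2015 - 1985)
print(f"Growth over 30 years: {factor:,.0f}x")  # → Growth over 30 years: 32,768x
```

Fifteen doublings compound to a factor of more than 30,000, which is why a 1985 supercomputer fits in your pocket today, and why even a few missed doubling cycles would be felt across the whole industry.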
During an investor call earlier this year, Intel’s (INTC) CEO Brian Krzanich admitted that the chip giant, whose co-founder Gordon Moore created Moore’s Law, would not be able to keep up with the pace set by Moore’s Law because it was becoming more difficult to shrink the space between transistors on the chips.
That’s what IBM’s research hopes to help solve. Dario Gil, VP of Science & Technology at IBM Research, tried to convey the importance of this research effort. “The value to IT and users of IT that performance scaling and density has enabled is unprecedented and without equal in the history of mankind,” he said. “To the extent that we can continue pushing that to its limits, we should.”
That’s one reason IBM is spending $3 billion on basic semiconductor research, but it also makes sense. We’re still producing more data that needs to be stored and computed, so if we suddenly hit a plateau in the cost of computing and storage, we will also hit a plateau in the innovation curve that relies on processing and analyzing the data we’re generating and bringing in from ever more connected devices. Our information economy is built on the falling cost of computing and storage, so if that suddenly changed, the economic impact would undoubtedly be huge. You could probably start by saying goodbye to the continuing Amazon Web Services price decreases and the annual feature upgrades in your iPhones.