What will tech come up with next? By Michael S. Malone @FortuneMagazine July 17, 2014, 11:21 AM EDT

As the legend goes, in 1964 Dr. Gordon Moore, then at Fairchild Semiconductor, was preparing a paper for Electronics magazine on the evolution of semiconductor memory chips. He decided to plot the capacity of those chips against their year of introduction on some graph paper. There were only a half-dozen or so data points, as memory chips at that point were less than five years old and contained only a few hundred transistors each. Connecting the dots, Moore noticed a familiar exponential curve – shallow at the beginning and then quickly turning upward. Unfortunately, that curve also quickly went straight off the top of the page. So Moore switched to logarithmic paper – that is, with one axis marked in powers of ten – and, stunningly, the memory chips tracked along a straight, gently sloping line. Moore, one of the most brilliant individuals in Silicon Valley history (and future Intel co-founder), not only knew what this said but, more important, what it meant. What it said was that semiconductor memory was progressing at a pace never before seen in any product in human history – and if that pace could be maintained, the generational leaps would soon be gigantic. This trajectory – at first defined as the doubling of the performance of semiconductor chips every couple of years – became known as Moore's Law. But what Moore's Law meant was that for the first time, perhaps in any industry anywhere, there was now a map into the future. You could track that line decades out – and know what memory chips would be like on any given date. And that meant you could plan for that date, and you could build for it. It was a magic key to competitive success.
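The arithmetic Moore spotted can be sketched in a few lines of code. The figures below are hypothetical, not Moore's actual data points; the point is only the shape: a count that doubles on a fixed cycle explodes on a linear axis, but its logarithm climbs by the same increment every period, which is why the data fell on a straight line once Moore switched paper.

```python
import math

def transistors(year, base_year=1965, base_count=64, doubling_years=2):
    """Idealized doubling rule: count doubles every `doubling_years`.
    The 1965 baseline of 64 transistors is illustrative only."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# On a linear axis the values run off the page:
print(transistors(1975))  # 2048.0 -- a 32x jump in a single decade

# But on a log scale each decade adds the same fixed increment,
# so the plotted points line up on a straight line:
for year in (1965, 1975, 1985):
    print(year, round(math.log10(transistors(year)), 2))
```

Running the loop shows log10 of the count rising by about 1.5 per decade, decade after decade – the "gently sloping straight line" that made the future look plannable.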
Moore's Law quickly spread from memory to logic chips and then to the rest of the semiconductor industry – and quickly made the chip business the fastest-growing industry. And soon, the most valuable. What no one, not even Moore himself, saw coming was that, by the 1980s and 1990s, with tens of billions of chips out in the world, Moore's Law would break out of electronics and into the rest of the economy. From automotive to infrastructure to genetic research to telephony – companies, laboratories and government agencies discovered that if they could find any way to hook up to Moore's Law, they too could experience exponential growth. One result was the great transformational technology of our time, the Internet.

In writing my new book on the history of Intel Corporation, The Intel Trinity, I became convinced that we have made a serious mistake in growing so comfortable with that shallow line. And that mistake begins with Gordon Moore's change of graph paper. Behind the gently sloping straight line there still lies that dizzying exponential curve. It is this reality that has been largely forgotten over the last few decades.

What lies in that steep arc? Like all exponential curves, it begins deceptively flat: for the first 40 years, Moore's Law is a gentle grade. Yet under that comparatively flat stretch can be found the minicomputer, the microprocessor, the digital calculator, computer gaming, the personal computer, the Internet, robotics, wireless telephony, the smartphone and electronic commerce – in other words, our world has been utterly transformed by just the shallowest section of this curve. But then, around 2005, roughly when the newest chips reached 1 billion transistors on their little squares of silicon, everything changed. Suddenly the great accumulating leaps caused by the biennial doubling of Moore's Law began to turn the curve nearly straight up, heading toward infinity – and tens of billions of transistors on each chip.
In other words, Moore's Law is now jumping the tech world forward each year more than the sum of all that has been accomplished since the birth of Silicon Valley. We already have glimmerings. Look at the rise of 'exponential' corporations like Facebook – the first service product in human history to reach 1 billion regular users – and Twitter. Look as well at the usage curves of the smartphone, the smart tablet, and the Cloud, the last of which essentially makes memory infinite, ubiquitous and free. All of these earthshaking new products and technologies have exploded onto the scene in the last eight years.

What's waiting in the wings? The full promise of Big Data – and the end of the 500-year age of sampling and statistics. Soon we'll be tracking every one of our heartbeats, every fish in the sea and every gust of wind – and we will learn more about the natural world in a few decades than we have in all of human history. As a billion devices around the world begin to talk with each other, we will also soon be just a minor part of the "Internet of Things," which may be a thousand times greater than the human-oriented Internet we know today.

Further up the curve lies the nanotech revolution. Mobile health and medicine, too. Go up even further and every function of the body will be measured every second of our lifetimes, while nano-scale sensors swim in our blood, helping to hunt down cancer and other diseases. Up the curve, the line between animation and reality also begins to disappear, and modeling – from new products to new worlds to new lives – becomes a major part of our daily existence. And it will all start with virtual sex, because in tech it always starts with sex.

And then? If you believe Ray Kurzweil, the line goes vertical, we map our brains into computers and live forever. If you believe Malcolm Gladwell, the curve will eventually taper off. But neither scenario may arrive for decades.
That means that as long as Intel and other chip companies can sustain Moore's Law, we may live within the Great Inflection for the rest of our lives. And, given the recent announcement by Intel and IBM of a revolutionary new type of transistor technology for chips, the odds of that occurring look better than ever. Even here in Silicon Valley, we are unprepared for this new pace of change. But ready or not, the future is coming … faster than ever.

Michael S. Malone is a veteran Silicon Valley-based journalist and author. He is the author of The Intel Trinity: How Robert Noyce, Gordon Moore, and Andy Grove Built the World's Most Important Company, published by HarperBusiness.