This year’s Detroit auto show is proving that autonomous driving is no longer a techie’s pipe dream. Even holdout Akio Toyoda has finally joined the parade. The self-driving car is coming.
But behind that development is an even more profound change: artificial intelligence, and in particular the approach known as “deep learning,” has gone mainstream. The autonomous driving craze is just the most visible manifestation of the fact that computers can now look, learn, and react to complex situations as well as or better than humans. It is leading to a profoundly different way of thinking about computing. Instead of relying on millions of lines of code written to anticipate every situation, these new applications ingest vast amounts of data, recognize patterns, and “learn” from them, much as the human brain does.
“2015 was the big year,” says NVIDIA CEO Jen-Hsun Huang, who was in New York to attend an AI conference and stopped by FORTUNE’s offices to talk. “AI has been plodding along for 30, 40, 50 years in research. All of a sudden last year, something happened.”
Huang’s company makes GPUs, or graphics processing units, which power many deep-learning applications. Two years ago, he said, NVIDIA was talking to only 100 or so companies interested in deep learning. “This year, we are working with more than 3,500.” The explosion cuts across industries: finance, medical imaging, media, and more.
Huang, of course, is not alone in seeing this inflection point. Google and Facebook both made big investments in this area last year. IBM is betting the company on what it calls “cognitive computing,” and it leads the industry in new patents. Google’s Eric Schmidt, also in New York for the AI event, said the technologies will not only transform business but also have the potential to help solve some of the world’s hardest problems, including population growth, climate change, and education.
So there’s a reason to be optimistic today.