Last week’s stellar jobs report notwithstanding, it’s easy to understand why so many people are anxious that they may find themselves out of work before long.

Oxford researchers suggest that nearly half of the occupations in the U.S. will be computerized over the next 20 years. Gartner predicts that one in three jobs will be converted to software, robots, and smart machines within a decade. And a McKinsey & Co. analysis finds that “as many as 45% of the activities individuals are paid to perform can be automated by adapting currently demonstrated technologies.”

All of which is why I’ve been so intrigued by Robert Cohen, a senior fellow at the Economic Strategy Institute, who is as sanguine about the future of labor as anyone I know.

Cohen not only sees the glass as half full; he sees it brimming over, thanks to three major trends:

  • First, more and more companies, including many old-line manufacturers, are moving to offer services — sometimes pushed there by upstart rivals.
  • Second, there is a need for new networks to handle sensor data from driverless cars and wearable devices.
  • Third, the increasingly rapid development and deployment of software and applications is feeding a surge of data analytics.

With this in mind, Cohen says, “cloud computing, Big Data, and the Internet of Things will employ millions of people in new types of jobs.”

More precisely, Cohen figures that as a new “virtualized infrastructure” gets built out over the next 15 years, as many as 25 million jobs will be created. He acknowledges that automation is certain to wipe out a bunch of positions, but he estimates that the net gain will still be around 15 million.

“It’s almost like building out the road system or railroad system,” says Cohen, who will present his views later this month at the Innovation for Jobs Ecosystem Summit in Menlo Park, Calif. “Now we’re basically building the superhighway for data. It will mean replacing old generations of computers with new ones.”

Cohen arrived at his forecast in two ways: by extrapolating from expected growth in the gross domestic product and by poking around at companies on the cutting edge of the trends he’s citing, including Netflix, Google, Amazon, and Facebook.

What has caught his eye is how many other companies are suddenly trying to be like them. Ford, for instance, is partnering with Amazon and home-automation company Wink to allow people to control lights, thermostats, security systems, and other features of their houses from the driver’s seat of their cars. Boeing is collecting data from sensors and mobile devices to provide its airline customers with real-time insights into how to better fly their planes and manage their fleets. Banks, insurers, hospitals, and pharmaceutical companies are all heading down similar paths as they seek to remain competitive, Cohen notes.

“It’s very impressive to see how many different types of companies, in how many different sectors, are beginning to operate in these ways,” he says. “This service orientation is going to ripple through every area.”

As the business world transforms itself, Cohen believes that there will be especially high demand for three types of workers: computer programmers, data analysts, and those who design, make, and install all sorts of sensors across the commercial landscape—a process, Cohen says, that “will require several stages of rebuilding to add more capabilities.”

Importantly, Cohen doesn’t think that the only ones poised to land good jobs in this new “software age” are the highly educated or highly skilled—a decidedly contrary assessment to those who maintain that all too many folks are destined for “gig economy” work that lacks security, benefits, and a chance for advancement.

“Substantial numbers” of managerial, marketing, manufacturing, cybersecurity, and support roles will be required as more “programmable enterprises” take shape, Cohen says, adding that “there will be whole new categories of jobs for people who lack formal degrees.”

Indeed, while some worry that the U.S. isn’t doing enough to train workers for the new economy, Cohen is optimistic here, as well. He is confident that community colleges, coding academies, and nonprofit organizations such as Girls Who Code will begin to supply talent and demonstrate mechanisms that companies can tap to fill openings. “We’re going to change our assumptions about how people access jobs,” Cohen says.

At the risk of succumbing to some Luddite fallacy, I must confess that I’m not terribly convinced by the case Cohen makes. Yes, new technologies have historically generated more jobs than they’ve killed off. But these days, it really does seem to me that something is different.

The speed with which entire industries are likely to be upended is unprecedented; imagine, for instance, what autonomous vehicles are going to do to those driving trucks, taxis, and more. In addition, higher productivity—which is often a result of new technologies being introduced—used to go hand in hand with job creation. But that link is now broken.

At the least, the labor market may well be subject to “greater disruptions” than in the past, as Erik Brynjolfsson and Andrew McAfee write in their book, The Second Machine Age.

Yet Cohen is unmoved, relishing his place as the positive provocateur. “Ninety-nine out of 100 people will tell you that technology is going to destroy jobs,” he says. “The media has almost made it acceptable that we’re not going to have job growth in the future. The argument has become so dominant, it’s almost like a reflex reaction.”

Only time will tell who is right—worriers like me or Bob Cohen. Frankly, I hope it’s him.

Rick Wartzman is the executive director of the Drucker Institute at Claremont Graduate University. The author or editor of five books, he is currently writing a narrative history of how the social contract between employer and employee in America has changed since the end of World War II.