Silicon Valley’s New Parlor Game? The ‘Last Job’

June 24, 2016, 3:37 PM UTC
[Photo: Sparks fly as robots weld the frames of Honda Odyssey minivans on the assembly line at the Honda Manufacturing of Alabama facility in Lincoln, Alabama. Luke Sharrett/Bloomberg via Getty Images]

A decade ago, vampires scared us. A few years ago, zombies did. But the last year or two has been all about the robots and—in the business world—worries over the future of work.

Mainstream publications created sections devoted to this worry. New conferences emerged. Martin Ford’s Rise of the Robots was the business book of last year. The 2015 film Ex Machina satisfied one taste (evil, beautiful robot) while its 2013 antecedent Her satisfied another (loving, invisible robot).

Why the sudden interest? After all, these waves have come before. The Luddites. The book The End of Work, published 20 years ago. The (mostly correct) concerns about the withering of American manufacturing. What’s different this time around?

There are background trends. An economic recovery that brought few jobs and raised income inequality. A fragmentation of careers into the gigging of the “sharing economy.” The first signs of “magic-feeling” artificial intelligence in everyday life (right, Alexa?). Until recently, the closest most people got to AI was spell-check. Now Tesla (TSLA) has a car with a learner’s permit, smartphones talk back to us, and cloud Barbie (MAT) is on the way.

What’s different now is that the jobs threatened are ours—highly educated “knowledge workers,” the very people who reaped the gains of technology for the last five decades. For the first time, the chattering class might get out-chattered by the bots. So we’re paying attention and worrying. Though we might find something better around the corner.

Almost every day I find myself in a conversation with someone in the technology industry that ends with an absurd guessing game: As the machines start doing all the work, what will be “the last job” left for us humans?

A judge presiding over a jury? A therapist? A nurse? Will one of us be like Norman Muller, the “voter of the year” in Isaac Asimov’s story “Franchise,” who supplies the last needed input for the machines to elect the president by opining on the price of eggs?

Last year, Fortune senior editor-at-large Geoff Colvin wrote the thoughtful “Humans Are Underrated,” making the case that we humans will be safe for some time. His core arguments were heartening: We’ll be in charge, we’ll choose which problems to solve, and we’ll need (given our human nature) to care for one another along the way.

I’d love to believe he’s correct, but there’s plenty of evidence to suggest that human beings are more than willing to accept care from a machine. And will we really be in charge? How do we feel when Waze (GOOG) surprises us with a new instruction to exit the freeway early, or when a factory shuts down production because of a bug in the code?

Others think the answer to the threat of human redundancy is novelty—humans are the only machines capable of solving new problems, while computer-machines can only do what they are told. That feels like a bedtime story we want to tell ourselves. In fact, we keep moving the goalposts of what’s considered novel. Is knowing how to navigate from Boston to New York novel? Is diagnosing heart failure novel?


Maybe the last job will be “AI trainer,” as people supply machines the data they need to mimic and, eventually, outdo us. Or maybe, as users of Google (GOOG) and Facebook (FB), we are all already AI trainers.

Every skill that looks novel will, at some point, become routine. Last month’s surprise achievement of the machines was outperforming a human cardiologist at assessing MRIs: NIH, Booz Allen, and Kaggle—one of our portfolio companies—announced an algorithm that does just that. The pace of machine intelligence is surprising, and it seems to be increasing.

Maybe the answer to our worries is that we’ll just invent more new things to do, as we humans have always done. Of course this is likely true in the long run. People want to be productive. We moved from a tribal society to an agrarian one over a few millennia, from a farming society to an industrial one over a few hundred years, and from an industrial economy to a service one in a few decades.

The issue is the speed of change. We can’t quite predict what’s next, though we know that when it comes, it might be an avalanche of change. (Can we get there without a debilitating crisis when professions become automated all at once, much as 3.5 million truck drivers now have to reckon with self-driving trucks? That’s a hard question.)

All of this machine intelligence will only take us so far: The computer-machines will have their goals, and we human-machines will have ours. The computer-machines may produce most of our GDP. The work that will remain the domain of humans—forever—is the work where the buyer intentionally buys something because it was made by a human being.

That could be an exciting future. We already have a word for this in some industries: handmade. Services like Kickstarter already appeal to the part of us that wants to buy more than a product or a service or a work of art—we want to buy the story that it was made in a certain way, by a certain person. (Set aside the risk that machines may counterfeit things made by humans shockingly well, in which case we might need better proof that something was handmade than a sewn-on assertion.)


The made-by-humans aspect is intrinsic to what makes some things valuable to us. We have already begun to resent many mass-produced goods. TV dinners went from a curiosity to a staple to an object of scorn. Perhaps we’ll one day value human-given therapy over the soon-to-come robot kind. (Or jury verdicts rendered by humans, even if we know the machines are less guilty of bias.)

So people will trade the things we make for one another. Culture, generally. Artisanal everything, wine included. The corner café run by an old friend. Local productions of Hamilton. Beautiful handwritten letters—if actually written by a human hand, that is. We might see a resurgence of fondness for culture in its smallest forms; the traveling bard might return as a 21st-century profession. The growth of Etsy (ETSY) signals that this may already be happening, as do the rise in demand for locally produced food and our collective obsession with media about cooking.

We humans will have our “human corner,” where we’ll buy from one another because it was made by one of us. (Of course we’ll often use the machines to help make all this human stuff, the way an author who writes a story types on a machine—but it’ll still be demanded because it was made by a person.)

If we make the transition to such a world without calamity, this human-corner economy could offer a wonderful future for us. We might place many of today’s side projects (crafting, or being in a choir) on the same level of importance as the things we list on our LinkedIn (LNKD) profiles. We might take pride in the things we make, not the awards we pin on ourselves. Sooner than that, I hope we’ll find better ways to recognize valuable but unpaid work like child- and eldercare.

Will there be a living wage in the human corner? How will we find meaning in work? What new mechanisms will feed, clothe, house, entertain, and inform us? That’s what we have time now to figure out.

Roy Bahat (@roybahat) is the head of Bloomberg Beta, a venture fund backed by Bloomberg L.P.
