Illustration by Sinelab
By Roger Lowenstein
March 16, 2017

I know an investor who anthropomorphizes the stock market, as if the Dow were a living beast. “Wouldn’t it be just like the market to draw us in with a rally, then deliver a sucker punch?” he will say, as if “the market” were a canny three-card monte player, entrapping tourists in Times Square.

Irrational as that may sound, it’s no more so than the ever-popular investing method known as “technical analysis.” This is a branch of astrology—chart interpretation—that purportedly “reads” the market, even though it tells you nothing about the underlying companies. You see a chart of zigs and zags and impute to it a story—the market is “exhausted,” or “bottoming,” or whatever.

Many investors are addicted to such fruitless techniques. Behavioral economics, which studies the impact of psychology on financial decision-making, explains why: the habit reflects what the author Nassim Nicholas Taleb called the “narrative fallacy.” It’s the human tendency, actually the human need, to impose order on random events and to make events we didn’t anticipate seem “predictable” after the fact.

The narrative fallacy explains many common financial errors. The news is that it’s equally useful for understanding today’s conspiracy politics. It’s why, rather than accept that John F. Kennedy was killed by a lone, random gunman, we invented conspiracy theories—to spare us the ontological despair of acknowledging that chaos, not order, rules the universe.

Taleb’s The Black Swan, published in 2007, became a handbook for market traders—a reminder that random, extreme events like economic crashes overtake the best of plans. They also overtake individuals—those who happened to be in the wrong place during a natural disaster or who worked in a once-healthy industry, such as journalism, that was undercut by a new technology.

Randomness has a subtler significance for ordinary business. It delivers extreme results for some firms and human endeavors but not for others. Life insurance and pensions are predictable businesses because the randomness is contained—average life spans aren’t going to suddenly change by much. But given the unpredictability of hurricanes, a property and casualty insurer concentrated in South Florida should prepare for the possibility of very big variations in claims.

People have a tough time accepting that many realms are beyond our power to forecast. Even retrospectively, many occurrences cannot be explained. In general, news events and historical developments are less predictable than people suppose. No one could have forecast Sept. 11, 2001, with any useful specificity—nor could economists foresee in 1970 that wealth distribution was about to skew, the rich were about to get richer, and middle America was going to stagnate. The number of variables is simply too great. The important variables (say, the potential of a new industry called software) might not even be recognized.

It’s possible to avoid falling for the narrative fallacy. If you’re an economist, this means trying not to superimpose, say, a pattern from the last economic cycle on the future. If you’re in business, assessing diffuse data to figure out whether to go ahead with an investment, it means resisting the temptation to see a coherent story in that data. Reality is messy. If the investment works, it will have to work in a future that spans a wide range of environments—recessions, new competitors, wars.

Investors, in particular, should be wary of falling in love with a stock’s story, after which every data point is seen as confirmation. Securities analysis is not an exercise in metaphysical truth seeking. It’s a prosaic business of buying stocks at a discount. But judgments can be wrong, or random stuff can make them wrong.

In the broader world, refusing to accept that stuff can go randomly wrong can lead to frustration, then to blame. A reason is sought.

Consider a pair of benign examples. The first is from The Black Swan. Taleb offers the sentence “The king died, and the queen died.” It is unmemorable. Then a slight refinement: “The king died, and then the queen died of grief.” Voilà! We have narrative; what’s more, we have the satisfaction of causation. No need to wonder why the queen died—it was from grief.

The second is from Michael Lewis’s new book, The Undoing Project. Lewis describes an experiment conducted by the Israeli psychologists Daniel Kahneman and Amos Tversky, pioneers in behavioral economics. For the purpose of quizzing unknowing test subjects, the two invented “Linda.” They cast her as a type: “31 years old, single, outspoken and very bright … deeply concerned with issues of social justice.”

They then asked subjects which was more likely—(a) “Linda is a bank teller” or (b) “Linda is a bank teller and is active in the feminist movement.” Since feminist bank-teller Lindas are a subset of all bank-teller Lindas, the correct answer is (a). But 85% of the subjects said (b)! People find it easier to engage with a feminist Linda who cares about social justice. This Linda adheres to a preconceived type; she bows to the narrative fallacy.

Kahneman and Tversky, in Lewis’s words, were asking how “a person’s understanding of what he sees change(s) with the context in which he sees it.” Taleb supplies an answer, also anticipated by the psychologists: People need narrative.

It’s striking how similar this is to the language of paranoia and conspiracy. The idea of “paranoia” as a political tendency was introduced by historian Richard Hofstadter. In “The Paranoid Style in American Politics,” an essay that appeared in Harper’s Magazine a year after the Kennedy assassination, Hofstadter traced the history of conspiracy theory in American life. A politically paranoid person, Hofstadter wrote, “is always manning the barricades of civilization,” engaged (so he believes) in a life-and-death struggle against an “amoral superman” endowed with magical, malevolent powers: “He makes crises, starts runs on banks, causes depressions, manufactures disasters.”

Although Hofstadter was writing about McCarthyism, his words apply equally to contemporary Newtown school-shooting deniers, vaccine paranoids, and the sundry confabulations (birtherism, millions of illegal voters) echoed in Donald Trump’s speeches. From the distance of half a century, one phrase leaps off the page: “The paranoid mind is far more coherent than the real world.”

The Boston Globe recently talked to a supporter of the President in Bellevue, Ohio, who lamented the disappearance of mom-and-pop stores. “You can thank the government for that,” he said. Notice how his chosen culprit supplies coherence: a simple explanation for a numbingly complex issue.

Hofstadter’s “coherence” is the narrative fallacy, harnessed to the saddle of political anxiety. Behavioral economists have shown us why we’re prone to think that way. Their insights go far beyond economics, to politics and ordinary life. Narrative fulfills and satisfies. It answers a human need. The queen died of grief. If only we knew it.

A version of this article appears in the March 15, 2017 issue of Fortune with the headline “The Stories We Fall For.”
