Thinking fast and slow

November 11, 2011, 3:00 PM UTC

Our Weekly Read column features Fortune staffers’ and contributors’ takes on recently published books about the business world and beyond. We’ve invited the entire Fortune family — from our writers and editors to our photo editors and designers — to weigh in on books of their choosing based on their individual tastes or curiosities. In this installment, senior editor Jennifer Reingold takes a look at Thinking, Fast and Slow, Daniel Kahneman’s tour of the human mind and how it makes decisions.

FORTUNE — Making your way through Daniel Kahneman’s astonishing book, Thinking, Fast and Slow, is something of an exercise in frustration.

The first level of frustration comes from the sheer amount of work he makes you do. Though he is an exceptionally clear writer — particularly for an academic — virtually every chapter opens with a problem that at first seems obvious and then, in a cruel twist, turns out to be proof that your thought process is guided not by rational thinking but by a kind of lazy overseer who’d rather take the easy path than work up a sweat to get to the right one.

A Nobel Prize-winning Princeton psychologist who did much of his work with the late cognitive science pioneer Amos Tversky, Kahneman describes our thinking as the interplay of two “Systems” that control most of our behavior: “System One,” which, were it a person, would be a mashup of a robot and Archie Bunker — lightning fast, yet quick to jump to preformed conclusions; and “System Two,” a sort of smarter Kato Kaelin. System Two is the override, what you use when doing something that requires more effort, be it walking faster or solving a complex math problem; at the same time, it wants to conserve energy and do as little work as possible to get there.

Here’s the next level of frustration: Both systems, as amazing as they are, often work together to lead you to make the wrong decisions, almost as much as they do the right ones. The brain works so efficiently in part because it is able to form nearly instant judgments based on past experiences. Yet those biases and assumptions are also just as likely to lead us astray, even as we firmly believe we are using our “rational” mind.

Take the example of Linda, a fictitious character described in Kahneman’s research as “thirty-one years old, single, outspoken and very bright … as a student, she was deeply concerned with issues of discrimination and social justice.” Participants in a study were asked to rank a list of statements about her likely occupation today, including “Linda is a bank teller” and “Linda is a bank teller and is active in the feminist movement.” Despite the fact that there are inarguably fewer bra-burning bank tellers than bank tellers as a whole, participants consistently ranked the second statement as more probable than the first — a clear example of assumptions proving more powerful than statistics.

Ultimately, the element of Kahneman’s book that is most frustrating is how hard it hits at someone like me, a journalist whose perceived “value added” is the ability to accurately assess people and situations. In just a few chapters, Kahneman demonstrates pretty persuasively that those of us in the analysis business — whether traders, therapists, economic forecasters, or CEOs — are, at the end of the day, essentially superfluous. If you want to know why no one predicted the Arab Spring or the financial meltdown, Kahneman has a simple but distressing answer: we never really had the ability to make a truly analytical assessment; our System One and System Two prevented that from ever happening.

To the facts, we add things like, for example, a “halo” effect, in which we give a seemingly successful leader the benefit of the doubt when things go wrong, because he once was so right. Our desire for a good narrative leads us to omit the power of luck; Kahneman notes that Larry Page and Sergey Brin were willing to sell Google (GOOG) for less than $1 million a year after its founding, but the buyer said the price was too high — certainly a lucky thing rather than a brilliant thing. Yet everything Google’s execs do today is likely to be seen within the rubric of brilliant businessmen.

When it comes to things like stock picking, argues Kahneman, our biases and assumptions influence our decisions so much that we’d do better to let a chicken peck a random course of action than use all of our “tools” to get there. Even the companies identified by Jim Collins in Built to Last don’t make it through Kahneman’s process. “On average,” he writes, “the gap in corporate profitability and stock returns between the outstanding firms and the less successful firms studied in Built to Last shrank to almost nothing in the period following the study … the average gap must shrink because the original gap was due in good part to luck, which contributed both to the success of top firms and to the lagging performance of the rest.” Regression to the mean.

Is there any hope? Can we retrain our Systems One and Two to abandon their biases and think like we think we think? Kahneman says it’s possible — sometimes. “The way to block errors that originate in System 1 is simple in principle,” he writes, “recognize the signs that you are in a cognitive minefield, slow down, and ask for reinforcement from System 2.” There’s only one problem, one that those of us who have recently had to make a tough decision under pressure know (or think we know): “Unfortunately, this sensible procedure is least likely to be applied when it is needed most.” Tools like medical checklists, for example, improve on purely intuitive decision-making. The key, says Kahneman, is recognizing just how fallible — albeit miraculous — our brain actually is. And that makes this book well worth reading. At least, that’s what I think I think.