Why do governments enact, and stick with, policies that are plainly failing? Why do companies adopt foolish strategies that produce massive losses? Why do labor unions, law firms, and religious organizations make self-destructive errors?
Over the past three decades, behavioral scientists have made progress in understanding why individuals make unwise choices and why groups do not correct, and frequently even aggravate, the mistakes of their members.
The most common and devastating failure of the group process is incomplete information-sharing. As the prison guard in the classic film “Cool Hand Luke” summarized it, “What we’ve got here is a failure to communicate.” In dozens of polls of corporate executives contemplating past decisions, a top regret is the failure to extract critical information from members of a team or committee.
One major symptom of communication failure, in both businesses and governments, is group polarization, which means that, after discussion, groups end up endorsing a more extreme version of what their members thought before they even started to talk. If people initially think that their favorite presidential candidate will capture the White House, they’ll become even more committed to that belief as a result of their discussion.
When we conducted discussions of “hot button” political topics such as global warming, same-sex civil unions, and affirmative action among the citizens of two towns in Colorado, we found dramatic polarization effects. Citizens of “the People’s Republic of Boulder” started out leaning liberal and became more extreme after discussion; citizens from “the Citadel” in Colorado Springs began their conversation on the other end of the political spectrum and drifted further to the right. Needless to say, the citizens in each town came to have rather unflattering views of the positions of their counterparts in the other city.
The same dynamic can lead businesses and public officials to be over-confident in approaches that are doomed to fail, and to show contempt for those who disagree.
The good news is that five strategies can help—and all of them can be implemented quickly.
Building a critical thinking culture
When people silence themselves in group discussion, it is often because they think that their reputations will suffer and that they will be punished, not rewarded, for disclosing information that differs from the majority's position. The problem lies in the group’s norm: “to get along, go along.”
Fortunately, norms can be changed. In experiments with groups, when people are told that the goal is to “get along,” they are far less likely to disclose what they know than when they are told that the task involves critical thinking.
The point applies to many organizations, including the Department of Justice, the Central Intelligence Agency, and corporate boards. In the United States, the best-performing companies have contentious boards whose members are willing to fight with each other.
The Vanguard Group, for example, has a reputation for a culture of reasoned dissent. Fund managers like Bob Auwaerter and Mabel Yu credit this culture as contributing to Vanguard’s prudent resistance to the lure of mortgage-backed security investments in the run-up to the recent financial crisis. Yu was new in her position as a fixed-income investment analyst at the mutual fund company, but her persistent concerns about the quality of these bonds, contrary to their AAA credit ratings and apparently bottomless profits, were not suppressed at Vanguard. Rather, her dissenting views were given careful evaluation – and were ultimately vindicated.
Private investment clubs often lack a culture that welcomes dissent, and when their members are united by close social ties, they tend to lose money. By contrast, the best-performing investment clubs benefit from norms that favor sharing new information, even if it departs from the consensus.
Within businesses and governments, happy talk is common, but it can be countered with some version of, “Now tell me something I need to know, even if I don’t want to hear it.”
When leaders don't immediately show their hand
Suppose that 10 people need to decide whether to launch a new product. Suppose, too, that a majority of them have information that strongly supports the launch, but that two members have contrary information, indicating that the launch will fail. Decades of studies suggest that the group is likely to go forward with the product. The reason is that groups are overly swayed by shared information; that is, information that most or all of their members already possess. Information that is initially known to only a few members will have little impact.
Leaders can do groups a service by emphasizing their desire to hear all points of view, including dissenting opinions. If senior managers are genuinely inquisitive, they are more likely to learn what they need to know. Leaders can refuse to state a firm view at the outset, which creates space for others to share their thoughts more freely. This strategy has been attributed to recent chairs of the Federal Reserve (Greenspan, Bernanke), who have been reported to poll members of their advisory committees before revealing their own preferences.
Franklin Delano Roosevelt was the master of this strategy. He gave different people the impression that he agreed with their positions, even if their views were at odds. Everyone thought the president agreed with them, so they were emboldened to develop and elaborate their opinions.
Within companies and governments, initial silence can be golden, especially from managers or senior officials.
Focusing on group, not individual, success
Group members often fail to disclose what they know because they believe they won't personally benefit from speaking up. Their own rational calculus argues in favor of silence. But organizations can change the incentives so individuals are rewarded if the group succeeds. In military teams, everyone is punished if a single member of the team fails, and rewards are given at the team level, at least during training. This method can produce amazing team performance, not to mention loyalty.
People are far more likely to disclose what they know when they feel that they have everything to gain from a correct group decision. But if group members focus on their own prospects, rather than those of the group, the group is far more likely to err.
Medical decision-makers are especially likely to reach self-protective conclusions that are not in their patients’ best interests. This is because there can be a discrepancy between personal payoffs (over-testing and over-diagnosing to avoid malpractice litigation) and what benefits patients (avoiding unintended consequences of unnecessary tests and false-positive errors). Winning practices, such as those at the Mayo Clinic, structure rewards at the medical team level.
Knowing the role of roles
Imagine that a group consists of people with specific roles that are appreciated and known by all group members. One person might have medical expertise; another might be a lawyer; a third might know about public relations; a fourth might be a statistician. Such a group is likely to get the information that it needs simply because each member knows, in advance, that everyone has something special to contribute.
Several experiments support this hypothesis. In one study, each member of a group was given a lot of independent information about one of three murder suspects. In half of these groups, the specific “expertise” of each member was publicly identified to everyone before discussion began. In the other half, there was no public identification of the specialists. The bias toward shared information was dramatically reduced in the groups in which experts were publicly identified.
The lesson is clear: If a manager wants to know what group members are thinking, it helps to give people distinct roles and to tell members, before deliberation begins, that everyone has different, and relevant, information to contribute. This is also why face-to-face get-acquainted meetings are crucial, particularly for teams whose members are working in different locations.
Appoint an adversary: Red-teaming
Many groups embrace the concept of the devil’s advocate: designating one member to play a dissenting role. Unfortunately, evidence for the efficacy of devil’s advocates is mixed. When people know that the advocate is not sincere, the method is weak. A much better strategy involves “red-teaming.”
This is the same concept as devil’s advocacy, but amplified: In military training, red teams play an adversary role and genuinely try to defeat the primary team in a simulated mission. In another version, the red team is asked to build the strongest case against a proposal or plan. Versions of both methods are used in the military and in many government offices, including NASA’s reviews of mission plans, where the practice is sometimes called a “murder board.”
Law firms have a long-running tradition of pre-trying cases or testing arguments with the equivalent of red teams. In important cases, some law firms pay attorneys from a separate firm to develop and present a case against them. The method is especially effective in the legal world, as litigators are naturally combative and accustomed to arguing a position assigned to them by circumstance. A huge benefit of legal red-teaming is that it can help clients understand the weaknesses of their side of a case, often leading to settlements that avoid the devastating costs of losing at trial.
One size does not fit all, and cost and feasibility issues matter. But in many cases, red teams are worth the investment. In the private and public sectors, a lot of expensive mistakes can be avoided with the use of red teams.
It is crucial for leaders to shock people out of complacency, to elicit hidden information, and to make a serious commitment to combating happy talk. Our five approaches will make groups a bit less cheerful and a lot less likely to blunder, saving jobs, time, money, and sometimes even lives in the process.
Cass R. Sunstein is the Robert Walmsley University Professor at Harvard Law School. Reid Hastie is the Ralph and Dorothy Keller Distinguished Service Professor of Behavior Science at the University of Chicago Booth School of Business. They are co-authors of Wiser: Getting Beyond Groupthink to Make Groups Smarter (Harvard Business Review Press).