The world is awash in claims of the benefits of truthfulness, candor, and transparency. A Google search using the phrase “benefits of candor” returned 30,500 entries, with just six for the opposite phrase, “costs of candor.” The kumbaya nature of leadership advice shows through.
But before you run off and tell everyone precisely what you are thinking and feeling, here are a few pieces of evidence in favor of the opposite approach.
About 50 years ago, a Harvard social psychologist and a San Francisco school principal studied Pygmalion effects in the classroom. They found that students who, on the basis of fictitious test results, had been labeled as likely to experience spurts in intellectual growth showed increases in measured IQ over the course of the school year. The effect was particularly pronounced for children in the first and second grades. This research led to a boom in similar studies, first in education and then in management and leadership.
An Israeli academic, Dov Eden, conducted a number of studies demonstrating that when leaders communicated high expectations for individuals ranging from salespeople to military personnel, those individuals performed at a higher level than people not subjected to similarly high expectations. A subsequent systematic review of the scientific literature confirmed the effects of expectations on performance and found that those effects were more pronounced for people who had previously been poor performers.
There are at least two mechanisms by which expectations affect a person's performance. One is called defensive effort. People who are told they won't do well will, reasonably enough, not try very hard. Why waste energy on a fruitless quest? By contrast, people who are told they are likely to succeed will invest more time and energy because they expect a payoff from their efforts.
Second, people, including teachers and supervisors, behave differently toward others depending on what they have been told about them. One article noted that when a person is given stereotype-cuing information about someone with whom they expect to interact, for instance information about physical attractiveness or intelligence, their behavior changes in ways that act to confirm the stereotype. For instance, people who thought they were interacting with a physically attractive person were more sociable, friendly, and likable than those who thought they were interacting with a less attractive individual.
In many cases, for positive expectations to improve performance, leaders or teachers must deliver false information to the targets. If poor performers are going to improve because they are told they are expected to do great things, leaders may have to say things they do not believe.
A related phenomenon in medicine is the placebo effect: people who believe they have been given a drug or treatment will respond more strongly simply because they think they received a potent treatment. For instance, a study of the administration of a stimulant (not a placebo) to cocaine abusers found that the physiological metabolic response was some 50% higher in people who were told they were being given the stimulant compared to people who received the identical dosage but were told they were being given a placebo.
A recent article in the New England Journal of Medicine noted that the therapeutic encounter—the doctor in the white coat, the other symbols and settings of medicine, and the apparent administration of some treatment—activated certain parts of the brain and affected patients’ levels of endorphins and dopamine. The article argued that some of these effects on neurotransmitters were identical to what was achieved when patients took actual drugs.
The potency of the placebo effect, coupled with the tremendous contemporary problem of opioid addiction, has led to the recommendation to sometimes use "fake" pills to treat patients' pain. The idea is to achieve pain relief without the administration (and availability) of addictive narcotics.
Once again, for the placebo effect to work, there must be deception. If someone says you are getting a sugar pill, the placebo effect won’t operate and there will be no benefit to the patient.
Placebo and expectation effects are examples of self-fulfilling prophecies—the concept that a certain idea produces behaviors that make the idea, even if originally false, become true. The classic example would be a run on a bank. If people believe a bank is on the verge of failing, they will rush to get their money out, which then causes the bank to fail.
For businesses to succeed, they need the support of investors, the purchases of customers, and the talent and energy of employees. But none of these parties will want to be associated with a company that is going to fail. So, one of the most important tasks of a leader is to convince others that the organization can and will be successful and that it deserves their support. Leaders who convincingly display confidence can attract the support that makes the confident posture become true, as the company becomes successful because others believe it will be and act on that basis.
Sometimes, as Intel co-founder and former CEO Andy Grove once told a Harvard Business School conference in the San Francisco Bay Area, this requires leaders to display confidence that they may not feel and to act as if they know what they are doing even if they don’t.
As quoted in a book I wrote with Bob Sutton, Grove argued that leaders needed to use deception to create the conditions for success: "Part of it is self-discipline and part of it is deception. And the deception becomes reality. Deception in the sense that you pump yourself up and put a better face on things than you start off feeling. But after a while, if you act confident, you become more confident. So the deception becomes less of a deception."
Grove also emphasized that leaders should not display uncertainty and insecurity, even if, to quote him again, “none of us have a real understanding of where we are heading.”
Forget for a moment the self-interested benefits that may come to people who deceive others for their own advantage. Suppose leaders have the purest of intentions and simply want people to succeed and to fulfill the lofty expectations set for them. Or maybe leaders want their organizations to succeed because success inspires people to put in more effort and stay at the company. Or perhaps doctors want to improve treatment outcomes by tapping into the placebo effect.
In all of these instances, people need to be able to convincingly prevaricate—which is one reason I sometimes say that the ability to lie convincingly may be the single most important management skill. Simply put, many situations in management—and medicine—rely on the operation of the self-fulfilling prophecy. The sooner we recognize this and incorporate it into leadership training, the better off we will be.
Jeffrey Pfeffer is the Thomas D. Dee II Professor of Organizational Behavior at the Graduate School of Business, Stanford University, and author of Leadership BS: Fixing Workplaces and Careers One Truth at a Time.