
The Wisest One in the Room: How You Can Benefit from Social Psychology's Most Powerful Insights

Availability: In stock, ready to be shipped

Original price: $18.99
Current price: $19.99
Renowned psychologists describe the five most useful insights from social psychology that will help make you "wise": wise about why we behave the way we do, and wise about how to use that knowledge to understand others and change ourselves for the better.

When faced with a challenge, we often turn to those we trust for words of wisdom. Friends, relatives, and colleagues: someone with the best advice about how to boost sales, the most useful insights into raising children, or the sharpest take on a political issue. In The Wisest One in the Room, renowned social psychologists Thomas Gilovich and Lee Ross ask: Why? What do these people know? What are the foundations of their wisdom? And, as professors and researchers who specialize in the study of human behavior, they wonder: What general principles of human psychology are they drawing on to reach these conclusions?

They find that wisdom, unlike intelligence, demands some insight into people--their hopes, fears, passions, and drives. It's true for the executive running a Fortune 500 company, the candidate seeking public office, the artist trying to create work that will speak to the ages, or the single parent trying to get a child through the tumultuous adolescent years. To be wise, they discover, one must be psych-wise when dealing with everyday challenges.

In The Wisest One in the Room, Gilovich and Ross show that to answer any kind of behavioral question, it is essential to understand the details--especially the hidden and subtle details--of the situational forces acting upon us. Understanding these forces is the key to becoming wiser in the way we understand the people and events we encounter, and wiser in the way we deal with the challenges that are sure to come our way. With the lessons gleaned here, you can learn the key to becoming "the wisest one in the room."

ISBN-13: 9781451677553

Media Type: Paperback

Publisher: Free Press

Publication Date: 12-20-2016

Pages: 320

Product Dimensions: 5.40(w) x 7.80(h) x 0.80(d) inches

Thomas Gilovich is a professor of psychology at Cornell University and author of The Wisest One in the Room (with Lee Ross), How We Know What Isn't So, Why Smart People Make Big Money Mistakes, and Social Psychology. He lives in Ithaca, New York. Lee Ross is a professor of psychology at Stanford University and co-founder of the Stanford Center on Conflict and Negotiations. He is the author of The Wisest One in the Room (with Thomas Gilovich), The Person and the Situation, and Human Inference.

Read an Excerpt

The Wisest One in the Room
In the early decades of the twentieth century, Albert Einstein dramatically challenged our understanding of the world in which we live. His revolutionary theories of special and general relativity suggested that time and space are linked in a manner best comprehended not through our subjective experience but through mathematical formulas and imaginative thought experiments. He tried to imagine, for example, what would happen if we were in a vehicle that was moving at nearly the speed of light. His famous E = mc² formula alerted us to the amount of energy that could be produced from the conversion of matter; but the same formula, when rearranged, suggested that matter itself could be seen as condensed energy. Indeed, in one of his many frequently quoted statements, Einstein went so far as to maintain that “reality is an illusion.”
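To make that rearrangement concrete, solving the same equation for mass gives

m = E/c²

so a given quantity of energy corresponds to an equivalent quantity of mass, which is the sense in which matter can be read as condensed energy.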

Scholars have debated exactly what he meant by that assertion. Most agree that he was alerting us to the ways in which experience is dictated by the perspective and circumstances of the perceiver. But for our purposes, the quotation serves as a reminder that what we experience in our everyday perceptions is not just a simple registering of what is “out there.” Rather, it is the product of an interaction between the strange and complex stuff that resulted from the “big bang” (the latest theory being that the stuff in question consists of vibrating strings of unimaginably tiny particles that somehow acquire mass as they interact with fields of energy) and the same stuff of which we ourselves are made. It is that interaction that produces our subjective experience of a world containing the solid three-dimensional objects we touch, the sounds we hear, the wide palette of colors we see, and the broad range of odors we detect.

Another twentieth-century genius, the comedian George Carlin, once asked his audience: “Have you ever noticed that anybody driving slower than you is an idiot, and anyone going faster than you is a maniac?” About two decades ago, the two of us began to consider the connection between Einstein’s message about reality and Carlin’s wry question. That connection, we believe, takes us to the very heart of human psychology and much of human folly. We human beings not only reflexively assume that our perceptions bear a one-to-one correspondence to reality; we often go a step further and presume that our own personal perceptions are especially accurate and objective.

To help you appreciate the nature of this objectivity illusion, let us engage in some political mind reading.

Specifically, let us show you that we can discern your political views from the mere fact that you are reading this book. We can confidently predict that:

You see yourself as being about as politically liberal as it is reasonable to be. On most issues, you see people who are to the left of you as a bit naïve, as more idealistic than realistic, and overly inclined to political correctness. At the same time, you see those who are to the right of you as rather selfish and uncaring, as somewhat narrow-minded and not fully in touch with the lives that many people live and the problems they face in today’s world.

Does this description capture the way you see yourself politically? We are confident that it does. The trick is that the political portrait we painted must apply not only to you and other readers of this book but to virtually anyone else. For if you felt that the people to the left of you were more attuned to reality than you are, you would have already moved in their direction. The same is true about people on your right.

In short, you (and everyone else) see your own political beliefs and leanings as the most realistic response to the specific times in which we live and the particular problems we face. You also see your views and positions as attuned to the realities of human nature. What’s more, given that you believe your political views are the ones most grounded in reality, it follows that those who do not share your views—especially those far removed from you on the political spectrum—are necessarily less realistic than you are. They lack your objectivity. They are more prone to seeing political matters through the prism of their ideology, self-interest, upbringing, or some other distorting influence.

Remember Carlin’s observation about your views of your fellow motorists. Your first response was likely to be, “As a matter of fact, I have noticed that about other drivers.” But after a moment’s reflection, you grasp Carlin’s point: Since you adjust your speed to what you consider appropriate to the prevailing road conditions, anyone driving more slowly must be driving too slowly, and anyone driving faster must be driving too quickly. The conviction that you see things as they truly are and that those who see things differently are therefore getting something wrong is inevitable—at least as an initial reflexive response.

Everyday experience offers many examples of the same basic phenomenon. When your spouse says, “It’s freezing in here,” and turns up the thermostat, even though you feel quite comfortable, you wonder what is making your spouse feel so cold when the temperature is just fine. Conversely, when you are freezing and your spouse or someone else says the temperature is just fine, you wonder why they are so oblivious to the actual temperature. You don’t immediately consider the possibility that you are the one being overly sensitive, or insensitive, and that the other person is the one responding appropriately to the “real” room temperature.

Similarly, when you say the music is “too soft,” or “too loud,” you believe that you’re making a statement about the music and not about yourself—or, rather, not about the complex interaction between the sound output, your auditory receptors, and whatever experiences have shaped your tastes and preferences. When you claim that the food is “too spicy” or “too bland,” you believe you are noting something about the food rather than your taste buds or the cuisine of your childhood and culture. And when others disagree—when they say the music you enjoy is a lot of noise and not up to the standards of their youth, or when they question how anyone could like that food (or that art, or that style of clothing), you wonder what’s responsible for the oddity of their tastes.

To be sure, you can probably think of counterexamples: times when you conclude (typically after some reflection) that you are the one who’s anomalous. You conclude that you’re particularly sensitive to the cold because you grew up in Costa Rica. Or you think your aversion to meatloaf might have its origin in the dry and tasteless recipe you were forced to eat on your frequent visits to your grandmother. Fair enough. These exceptions are real and important, but they are just that—exceptions. They result from the tendency we all have, especially when young, to ruminate when we feel or think differently from our peers about matters of taste in things like art or music or enjoyment of particular leisure activities. As adolescents we might have even wondered, “Why can’t I be like everyone else?” As we grow older, such ruminations tend to shift from what is wrong or unique about me to what’s wrong with them.

But ruminations aside, our phenomenological experience is that we perceive things as they are—that the room really is cold and that Grandma’s concoction really is awful. In the remainder of this chapter, we examine how the tendency to treat our sense of what’s out there as a matter of objective perception rather than subjective interpretation lies at the root of many types of human folly.

Psychologists, following the lead of Lee and his colleagues, refer to the seductive and compelling sense that one sees the world the way it is, and not as a subjective take on the world, as naïve realism. Recognizing that you and everyone else are naïve realists is a vital step in becoming a wiser person. It will make you wiser about all sorts of experiences you will encounter in your daily life. It can help you deal more effectively with disagreements with friends, family members, and coworkers. It will also make you wiser about political and social issues of great significance at a time when our nation and our troubled world are beset with disagreements and conflicts. But to fully understand how an appreciation of naïve realism can promote the type of wisdom we have in mind, we must back up and ask a more basic question. What gives rise to the conviction that there is a one-to-one relationship between what we experience and what is “out there”?

One of the main jobs of the three pounds of neural circuitry we carry around in our skulls is to make sense of the world around us. That circuitry determines, effortlessly and with dispatch, whether a surface affords walking, an object is benign or threatening, a movement was intentional or random, or a face is novel or familiar. Most of this sense making is done through mental processes that operate without our awareness, leaving us with the sense but no awareness of the making. A host of stealthy mental processes works away without our knowledge or guidance, rendering sensible the barrage of conflicting and confounding information that confronts us. This lack of conscious access to our sense-making machinery leads to confusion between what Immanuel Kant called “the thing as it is” (das Ding an sich) and “the thing as we know it” (das Ding für uns).

When we see a toaster, smell a delicious aroma, or detect a threatening gesture, it feels as if we’re experiencing the stimulus as it is, not as we’ve constructed it. Our own role in the construction of our sensory experiences is perhaps easiest to appreciate when it comes to color vision. It appears to us that the apples we see are red, the oceans blue, and the tall arches near fast food establishments yellow. But the colors we see are not simply “out there” in the objects we perceive; they are the product of the interaction between what’s out there and the functioning of our sensory systems. Our experience of color is the result of the activation of particular photoreceptors that are differentially sensitive to various wavelengths of light striking the retina, as well as further processing of the complex pattern of activation that reverberates higher up in the brain.

It is a telling fact about how thoroughly the brain creates this illusion of red apples, blue oceans, and yellow arches that we commonly say that dogs are color blind (actually they do see colors, but the colors they see are neither as rich nor as varied as we humans see them), yet we never say that we are “odor blind.” We don’t acknowledge that the world really is smellier than it seems, but that we, because of the limitations of our olfactory organs and brains, are able to detect and distinguish only a tiny fraction of the odors that dogs (and almost all other mammals) readily perceive.

Educated adults are aware of the basic facts of color vision, but that awareness in no way alters the perception that color inheres in objects. Nor does it stop us from talking about orange sunsets, blue eyes, and auburn tresses. And when it comes to more complex cognitive events, we are even less aware of our own contribution to our experience. We effortlessly fill in gaps in the sensory signals available to us, without any awareness that there are gaps to be filled—or that we did the filling.

Remarkably, the filling-in can be driven not just by prior information and expectations, but also by information we receive only after the fact. In one telling study, research participants heard sentences with the first part of a key word omitted (which we indicate by “*”), and with different endings of the sentence presented to different participants. Thus, some participants heard “The *eel was on the axle,” and others heard “The *eel was on the orange.” In both cases, the participants reported hearing a coherent sentence—“The wheel was on the axle” in the first case and “The peel was on the orange” in the second—without ever consciously registering the gap. Nor did it register that they themselves had provided the wh or p they “heard” in order to make sense of the sentence.1

Confusing our mental models of the things out there in the world with the things themselves is not of great consequence when everyone else has the same mental model, as they tend to do for apples, the sky, or McDonald’s arches. Nor is it a problem when we all manage to edit out the same speech disfluencies. But this confusion can have less benign consequences when dealing with social problems and policies. This is particularly true when two parties bring very different experiences, priorities, and beliefs to the task of sense making. In such cases, perceptions of what is fair, what is sacred, or who is responsible for the woes of the world are bound to vary. Disagreements are likely to lead to accusations of bad faith or bad character, making those disagreements even harder to resolve. It is in these circumstances that the wisest in the room recognize that their take on “reality” is just that—a take, and not an objective assessment of what “just is.”

You’re driving down the road and see a group of police officers trying to break up a protest in front of a reproductive health clinic. Does it seem that the police are overreacting, curtailing the protesters’ right of assembly? Or is the protest getting out of hand, requiring deft intervention by the police? A remarkable study by Yale Law professor Dan Kahan and his colleagues shows just how much your answers to these questions are likely to be influenced by your political views. Mind you, it is not simply that your political leanings are likely to influence your opinions of the actions of the police or the protesters. They also influence what you see the police and protesters doing.

Kahan and colleagues showed participants segments of an actual conflict between protesters and police that took place in Cambridge, Massachusetts, in 2009.2 Half of the participants were told that the demonstrators were protesting the availability of abortion in front of a reproductive health center; the other half were told they were protesting the military’s “don’t ask, don’t tell” policy in front of a campus military recruitment center. The participants had earlier filled out a survey of their political attitudes and values, and so the investigators had a good sense of whether they were likely to be sympathetic or opposed to a protest against abortion rights or a protest against the don’t ask, don’t tell policy.

The participants with different political outlooks “saw” very different actions on the part of the protesters and police. Three-quarters of the supporters of women’s reproductive rights saw the protesters blocking access to the health center; only a quarter of those from the opposite side of the political spectrum saw them doing so. When participants were told the action took place in front of a military recruitment center, these judgments were reversed: Three-quarters of the more conservative respondents saw the protesters blocking access to the center, compared to only 40 percent of those from the other side of the spectrum. A similar disparity in perceptions was observed when participants were asked whether the protesters had screamed in the faces of those trying to enter the health center vs. the recruitment center.I

The investigators did not ask their participants to discuss the case. We wish they had. It would have been interesting—and informative—to see how they would have dealt with their very different assessments of what they had “seen.” We are all used to dealing with people who have different values and opinions than we do, and while discussions about those differences are not particularly enjoyable, they are usually civilized and we generally make some effort to understand our differences. But when we are challenged about what we consider “the facts,” the discussion heats up and civility often goes out the window.

At the end of Star Trek III: The Search for Spock, after the heroes from the Starship Enterprise have spent more than ninety minutes trying to retrieve their Vulcan friend Spock’s body for proper burial on his home planet, a resurrected Spock gratefully says, “You came back for me.” James Kirk, the captain of the Enterprise, thereupon modestly dismisses any assumed heroism with the assertion “You would have done the same for me.”

Here on earth, this “you would have done the same” conviction is remarkably common. We witness it whenever a person-on-the-street who has administered CPR, saved a drowning child, or run into a burning building to rescue an elderly resident is interviewed. “Anyone would have done the same thing” is the usual reply. We also see it on the other side of the moral spectrum, when those guilty of wrongdoing defend their actions. During the 2005 congressional inquiry into the doping scandals in Major League Baseball, for example, admitted steroid user Mark McGwire said, “Anybody who was in my shoes that had those scenarios set out in front of them would have done the same exact thing.”4 The assumption is so common that it serves as the title of a track from the hip-hop trio Naughty by Nature, “Would’ve Done the Same for Me.”

But is the assumption valid? Or does naïve realism lead us to overestimate the degree to which others share our views and behavioral choices? It does indeed. Because people have the conviction that they see things as they are—that their beliefs, preferences, and responses follow from an essentially unmediated perception of objects, events, and issues—it follows that other rational, reasonable people should reach the same conclusions, provided they have been exposed to the same information. This seemingly reasonable leap gives rise to a phenomenon that Lee and his colleagues dubbed the false consensus effect: People tend to think that their beliefs, opinions, and actions enjoy greater consensus than is really the case. More precisely, people who have a given opinion or preference tend to think that it is more common than do those with the opposite opinion or preference.5

People who prefer Italian to French cinema think their preference is more common than French film enthusiasts do.6 People who are guilty of particular misdeeds think that those deeds are more common than people who wouldn’t dream of such transgressions.7 Liberals think that there is more support for their candidates, and their views on contentious social and political issues, than conservatives do, and vice versa.8 And voters from both sides of the political spectrum think that nonvoters would have voted for their candidate if they had only cast their ballots.II 9

In a vivid illustration of this phenomenon, Lee and his colleagues asked student volunteers to walk around campus wearing a large sandwich-board sign bearing a message (e.g., “Eat at Joe’s”) and to note the reaction of people they encountered. The students, however, were given the opportunity to decline the invitation to participate if they wished (and return for a later study instead). Immediately after agreeing or refusing to participate, the students were asked to estimate the frequency of agreement on the part of other participants and to make inferences about the personal attributes of someone who would accept the experimenter’s invitation and someone who would refuse it.

As predicted, the consensus estimates and trait inferences were very different for the two types of participants. Those who agreed to wear the sign estimated agreement to be more common than refusal and less revealing of the person’s personal attributes. Those who refused to wear it thought that refusal would be more common than agreement and assumed that agreeing to wear the sign said more about a person’s personality.

It is easy to appreciate the role that naïve realism played here. Those who imagined wearing the sign in benign terms—walking relatively unnoticed, explaining to acquaintances that one is taking part in a psychology experiment (and being complimented for being a “good sport”)—would be inclined to agree to the experimenters’ request and to think that most other “normal” students would also agree. For such individuals, the refusal to undertake this task and have such experiences would seem to reflect uncooperativeness, uptightness, or some other departure from normality.

By contrast, those who imagined what it would be like in less positive terms (e.g., walking through throngs of giggling, finger-pointing students; seeing acquaintances shake their heads and avert their gazes as they wordlessly hurry off) would be likely to refuse the experimenters’ request and expect others to refuse. For them, agreeing to wear the sign would seem more reflective of something atypical or negative (e.g., submissiveness or inclination to show off and make a fool of oneself).

The essential dynamic here was recognized long ago by the groundbreaking social psychologist Solomon Asch, who stressed the importance of distinguishing between different “judgments of the object” and different “objects of judgment.”10 When sizing up the responses of their peers, people often fail to take into account the possibility that their peers may be responding to a very different set of “facts” and “circumstances.”

Evidence for this dynamic was offered in a series of studies that Tom conducted.11 If the false consensus effect arises from the failure to recognize that other people may be responding to very different “objects of judgment,” the effect should be greatest when the issue at hand offers the most latitude for different interpretations and for fleshing out details and resolving ambiguities. To test this idea, a panel of judges was asked to rate the items used in Lee’s previous research on the false consensus effect in terms of their ambiguity and latitude for different interpretations. As anticipated, items that offered the most room for different interpretations (“Are you competitive?” “What percentage of your peers are competitive?”) yielded much larger false consensus effects than those for which there was little room for different interpretation (“Are you a first-born or later-born child?” “What percentage of your peers are first-born children?”).

Tom went on to conduct a study inspired by the intense arguments music fans can get into over the relative merits of different eras of popular music. Participants in this study were first asked whether they preferred 1960s or 1980s music and then were asked to estimate the percentage of their peers who would have each preference. As predicted, those who preferred 1960s music thought that more people would share that preference than did those who preferred 1980s music. Conversely, those who preferred the 1980s music thought that more people would share their preference than did those who preferred 1960s music.

The study then zeroed in on the source of these different assessments by asking the participants what particular examples of music they had in mind when they offered their assessments. Those who preferred the music of the 1960s and expected most of their peers to do likewise offered examples of music from the 1960s that independent judges rated highly (the Beatles, the Rolling Stones) and music from the 1980s that independent judges did not like as much (Judas Priest, John Mellencamp). Those who preferred 1980s music listed vastly different examples (Herman’s Hermits and the Ventures as 1960s music and Bruce Springsteen and Michael Jackson as 1980s music). Participants’ preferences, in other words, were certainly a reflection of different musical tastes. But they were also a reflection of the particular examples they happened to generate when answering the question they were asked, and they failed to recognize their own role in fleshing out the two categories when estimating the likely responses of their peers.

This same dynamic plays out in the domain of political discourse. Issues and events that are the object of social, political, or ethical controversy are bound to be construed differently by different individuals. This was illustrated by the study of what people on different sides of the political spectrum saw in a clash between protesters and police. It is also illustrated by the different reactions on the part of the political Left and Right when it comes to the abortion issue, the use of lethal force by the police, and ongoing debates about the treatment of prisoners at Guantánamo Bay.

When Fox News anchors proclaim that the United States should use enhanced interrogation techniques and that those who say otherwise are putting the country at risk, they have in mind harsh physical treatment of those who are indeed determined to kill as many innocent civilians as possible. But when the talking heads at MSNBC take their very different stance, they have in mind the torture of minor al-Qaeda functionaries or innocent individuals who were accused of misdeeds by someone with a personal score to settle.

To be sure, those on the Left and Right would likely disagree about the use of specific interrogation techniques even when an individual’s exact links to a terrorist network are known with certainty. When Dick Cheney says, for example, “I’m more concerned with the bad guys who got out and released [from Guantánamo] than I am with the few that, in fact, were innocent,”12 he is articulating a set of values that few on the Left would endorse. But disagreements on this issue are heightened, and attributions about those on the other side become more malignant, when, in Asch’s memorable language, the participants in the debate are responding to “different objects of judgment.”

This failure to recognize that those with different views may be responding to very different objects of judgment can thus fuel misunderstanding and prolong conflict. It leads disputants to make unwarranted, highly negative inferences about each other’s values, beliefs, compassion, or sincerity—inferences that can serve only to intensify the conflict at hand. Individuals and groups involved in conflict are often urged to walk in one another’s shoes and try to see things through one another’s glasses. Such footwear and eyewear exhortations are easy to offer but difficult to follow. But the wisest ones in the room can at least try to distinguish disagreements about facts and interpretations from disagreements about values and preferences.

Many Americans went to bed on November 7, 2000, thinking that Al Gore had just been elected president. But when they awoke the next morning, they learned that George W. Bush had inched ahead of Gore in the crucial state of Florida with its twenty-five electoral votes, giving him enough votes to stake a claim to the presidency. Because Bush’s margin of victory in Florida was so small (less than half a percent of the votes cast), a vigorous legal battle ensued between the rival campaigns, leading the Florida Supreme Court to order a manual recount of all ballots in the state. The very next day, however, the U.S. Supreme Court granted a stay of the enforcement of the Florida court’s decision. And a few days after that, the U.S. Supreme Court blocked the recount altogether, with the majority arguing that allowing the vote to go forward would violate the equal protection clause of the Fourteenth Amendment.

Democrats were quick to criticize the decision and claim bias on the part of the prevailing justices in the 5–4 decision that split perfectly along liberal-conservative lines. As one legal scholar observed, “I do not know a single person who believes that if the parties were reversed, if Gore were challenging a recount ordered by a Republican Florida Supreme Court . . . [the majority] . . . would have reached for a startling and innovative principle of constitutional law to hand Gore the victory.”13 What seemed to many to be especially suspicious was that the conservative majority on the Court was suddenly willing to insert federal authority in this case despite their frequently expressed reservations about judicial activism and their advocacy of states’ rights and a narrow interpretation of the equal protection clause.

While most Democrats thought the majority’s decision was tainted by ideological and motivational bias, the five justices who wrote the opinion didn’t see it that way. They insisted that they were applying the law in an even-handed fashion. Shortly after the decision, for example, Justice Clarence Thomas told a group of students in Washington, D.C., that the decision was not in any way influenced by partisanship.14 Justice Antonin Scalia has been even more dismissive of any such claim, telling an audience at Wesleyan University to “get over it.”15

Research has shown that the five majority justices in Bush v. Gore are hardly exceptional in this regard. Lee and one of his former students, Emily Pronin, asked people about how susceptible they were to a host of biases that plague human judgment. Their