Lynn Vavreck is a professor of political science at UCLA and co-author of “The Gamble,” about the 2012 presidential campaign. This column appeared in the New York Times.
“What one does when faced with the truth is more difficult than you would think.” — Wonder Woman, in the movie released last month.
This goes for all of us, not just superheroes.
The nuances of how people react when faced with the truth have come into focus in today’s increasingly polarized political climate.
A decade ago, the comedian Stephen Colbert introduced viewers to the idea of “truthiness,” a quality belonging to claims that are based on gut feelings instead of facts. Last year, the Oxford Dictionaries named “post-truth” the word of the year. It describes a general characteristic of our time: objective facts becoming less influential than emotions or personal beliefs.
Norbert Schwarz, co-director of the University of Southern California’s Mind and Society Center, and his colleagues say the process goes something like this: When people consider whether something is true or not, they engage in either analytic or intuitive evaluations. Analytic evaluations are cognitively taxing and may involve searching for information like knowledge drawn from books or experts. Intuitive evaluations require less effort and are largely based on gut feelings like familiarity or ease of understanding.
We all use both methods depending on what we are thinking about, like where to go for a haircut or whom to call to fix our refrigerator. You may know something about how to cut hair or fix appliances, but most likely you rely on some helpful shortcuts. Maybe you watched people coming out of a few salons for days and went with the one whose results you liked best. But more likely you relied on the recommendation of a friend or the convenience of the shop whose very professional sign you pass every day on your way home.
Whether you end up in the analytic or intuitive camp has as much to do with how familiar the claim feels to you, and how easily the story flows for you, as it does with how much objective information you have about a topic, even a political one.
Let’s start with what psychologists call social consensus. This is the idea that if lots of people believe something, it is likely to be true. People, it turns out, have been shown to be more confident in their beliefs if others share them. Social consensus turns on the question, “Do other people believe the claim?”
Analytic evaluations of this criterion might involve the use of poll results or supporting statistics. When this kind of evidence is hard to acquire, the gut can substitute a look-alike piece of evidence — familiarity. In the absence of data about whether people believe something, sometimes people simply ask themselves, “Have I heard this before?” The logic is simple. If many people believe something, you’ve probably heard it repeated a few times. And if you’ve heard it a few times, that makes it familiar. Therefore, if it’s familiar, you conclude others believe it.
This means that candidates, groups or anyone looking to influence opinions can increase the likelihood that people believe their claims by making them seem familiar. When President Trump starts a statement with the phrase, “Lots of people are saying,” he is generating a sense of social consensus. When he repeats something many times in the same speech, as he did in his first interview as president with David Muir of ABC, he is making his claim seem more familiar to listeners and increasing its believability.
Look at the ways Mr. Trump creates social consensus in just one exchange in the interview: In response to a question from Mr. Muir, he said, “You know what’s important, millions of people agree with me.”
“All of the people,” he followed up, “they’re saying: ‘We agree with Mr. Trump. We agree.’” He is making the things he says seem more familiar to listeners who are processing intuitively.
You might think people would discount information repeated by the same speaker in a short sequence, but you’d be wrong. In one experiment, researchers showed participants text that was repeated on a page, clearly the result of an error at the print shop. Even in this circumstance, repeating the information increased its credibility. It matters more that ideas are repeated than how they are repeated.
Another criterion is compatibility — the impression that a claim fits with what you already believe or feel. The more the information fits, the more likely you are to accept it as true.
There is an analytic and an intuitive way to assess compatibility, too. To nudge someone from intuitive to analytic evaluation on this yardstick, all you have to do is make the task feel less easy and familiar.
Psychologists in one study asked people how many animals of each kind Moses took on the Ark. Most people replied “two” despite the fact that it was Noah, not Moses, who populated the Ark. When the question flowed less smoothly — because it was written in a hard-to-read typeface — people were much more likely to notice the mistake in the question. Engaging the brain by slowing it down can help people appreciate incompatibility and reject claims they might otherwise accept as true.
Evaluating claims intuitively rather than analytically may interact with the way many people gather information these days — via social media — in a worrisome way. Familiarity of messages and ideas increases with reposting, retweeting or sharing. And since most people are friends with people who are like them — even politically — the increasingly familiar messages are one-sided. A false sense of social consensus develops. Coherence and compatibility make the stories flow more smoothly, and as a result, new information is intuitively accepted as true because it feels right.
In these cases even Wonder Woman, with her Golden Lasso of Truth, may be of little help. As the political scientist and Upshot contributor Brendan Nyhan has written, correcting facts that are wrong or misleading may be successful some of the time, but people’s psychological attachments to their political party make them likely to resist information presented even by objective sources.
So with battles between the head and the gut rampant, what’s the future of truth in this post-truth era? The evidence from social psychology suggests all is not lost. Simply telling people they are wrong is unlikely to change many minds, but making the flow of false claims a bit more bumpy can push people into analytic evaluations rather than intuitive ones.
Even when that isn’t possible, introducing ideas that are different and making them familiar may slowly change people’s thoughts about what “many people” believe. And that can change notions about social consensus. This is harder than it sounds, but musicians, novelists and filmmakers engage people with new experiences and points of view all the time. “The Mary Tyler Moore Show” made the idea of a single woman in the workplace familiar to many Americans, the same way Aziz Ansari’s “Master of None” helps non-millennials understand the particular challenges and charms of being a young person in the digital age.
Even better than books, film and television is actual conversation — getting out of the post-truth era may require listening to people on the other side. The goal is to make what feels true compatible with what is true.