How to Deprogram Truth-Denying Trump Voters

Trump didn't invent the term "fake news," but his campaign used it often enough to manipulate voters that it's no wonder so many of them believed him.

How did Donald Trump win, when he used so many misleading statements and outright deceptions? Couldn’t people see through them? As an expert in brain science, I want to share why his followers fell for his lies and what can be done to address this situation in the future.

First, let’s get the facts straight. At the time of writing, PolitiFact.com, a well-known non-partisan website, rates only about 4 percent of statements by Trump as fully “True” and over 50 percent as either completely “False” or what it calls ridiculously false—“Pants on Fire,” with the rest in the middle. By comparison, 25 percent of Hillary Clinton’s statements rated as fully “True” and only 12 percent as either “False” or “Pants on Fire.”

The Washington Post, one of the most reputable newspapers in the country, wrote that “There’s never been a presidential candidate like Donald Trump—someone so cavalier about the facts and so unwilling to ever admit error, even in the face of overwhelming evidence.” In its rulings on statements made by Trump, the paper’s editors gave 64 percent of them Four Pinocchios, the worst possible rating. By contrast, statements by other politicians tend to receive the worst rating only 10 to 20 percent of the time.

These sentiments are representative of other prominent news media and fact-checking outlets. Yet according to an ABC News/Washington Post poll, most voters on the eve of the election perceived Donald Trump as more trustworthy than Hillary Clinton. This false perception came from the Trump campaign building on previous Republican criticism of Clinton, much of it misleading and some of it accurate, to successfully manipulate many voters into believing that Clinton was the less honest candidate, despite the evidence that she was much more honest than Trump. The Trump campaign did so through the illusory truth effect, a thinking error in our minds that happens when false statements are repeated many times and we begin to see them as true. In other words, just because something is stated several times, we perceive it as more accurate.

You may have noticed the last two sentences in the previous paragraph had the same meaning. The second sentence didn’t provide any new information, but it did cause you to believe my claim more than you did when you read the first sentence.

The Biology of Truth vs. Comfort

Why should the human brain be structured so that mere repetition, without any more evidence, causes us to believe a claim more strongly? The more often we are exposed to a statement, the more comfortable it seems. The fundamental error most people make is mistaking statements that make them feel comfortable for true statements.

Our brains cause us to believe something is true because we feel it is true, regardless of the evidence—a phenomenon known as emotional reasoning. This strange phenomenon can be easily explained by understanding some basic biology behind how our brain works.

When we hear a statement, the first thing that fires in our brain, within a few milliseconds, is our autopilot system of thinking, composed of our emotions and intuitions. Also known as System 1, the autopilot system is one of the two systems of thinking that the Nobel Prize-winning scientist Daniel Kahneman described in his 2011 book Thinking, Fast and Slow, and it represents the more ancient system of our brain. It protected us in the ancestral environment against dangerous threats such as saber-toothed tigers by making us feel bad about them, and it drew us toward what we needed to survive, such as food and shelter, by making us feel good about them. The humans who survived learned well to heed the autopilot system’s guidance, and we are the children of these humans.

Unfortunately, the autopilot system is not well calibrated for the modern environment. When we hear statements that go against our current beliefs, our autopilot system perceives them as threats and causes us to feel bad about them. By contrast, statements that align with our existing beliefs cause us to feel good and we want to believe them. So if we just go with our gut reactions—our lizard brain—we will always choose statements that align with our current beliefs.

Where Do We Get Our News?

Until recently, people got most of their news from mainstream media, which meant they were often exposed to information they didn’t like because it did not fit their beliefs. Budget cuts and the consolidation of media ownership over the last decade have made mainstream media increasingly less diverse, a trend well described in Eli Noam’s 2009 book Media Ownership and Concentration in America. Moreover, according to a 2016 survey by the Pew Research Center, many people now get their news mainly or only from within their own personalized social media filter bubble, which tends to exclude information that differs from their own beliefs. As a result, their existing beliefs are reinforced, and it seems as though everyone shares them.

This trend builds on a traditionally strong trust in friends as sources of reliable recommendations, documented in the 2015 Nielsen Global Trust in Advertising Report. Our brains tend to extend the trust we associate with friends to other sources of information we see on social media. This thinking error is known as the halo effect: our positive assessment of one element of a larger whole transfers to its other elements. We can see this in research showing that people’s trust in social media influencers has grown over time, nearly to the level of trust in their friends, as shown by a 2016 joint study by Twitter and the analytics firm Annalect.

Even more concerning, a 2016 study from Stanford University demonstrated that over 80 percent of students, who are generally experienced social media users, could not distinguish a news story shared by a friend from a sponsored advertisement. In a particularly scary finding, many of the study’s participants judged whether a news story was true based on irrelevant factors, such as the size of its photo, as opposed to rational factors such as the credibility of the news outlet.

The Trump team knew that many people have difficulty distinguishing sponsored stories from real news stories, which is why it was at the forefront of targeting voters with sponsored advertorials on social media. In some cases the campaign used this tactic to motivate its own supporters, and in others it used it as a voter suppression tactic against Clinton supporters. The Trump campaign’s Republican allies created fake news stories that got millions of shares on social media. The Russian propaganda machine also used social media to manufacture fake news stories favorable to Trump and critical of Clinton.

Additionally, Trump’s attacks on mainstream media and fact-checkers, both before and after the election, undercut the credibility of news outlets. As a result, trust in the media among Republicans dropped to an all-time low of 14 percent in a September 2016 Gallup poll, less than half of its 2015 level. Fact-checking fares even worse among Republicans, with 88 percent expressing distrust in a September 2016 Rasmussen Reports poll.

All this combined to produce an unprecedented reliance on and sharing of fake news by Trump’s supporters on social media. A recent study by the Center for Media and Public Affairs (CMPA) at George Mason University used PolitiFact rulings to find that, since the rise of the Tea Party, Republicans have tended to make many more false statements than Democrats. Lacking trust in the mainstream media and relying on social media instead, a large segment of Trump’s base indiscriminately shared whatever made them feel good, regardless of whether it was true. Indeed, one fake news writer, in an interview with The Washington Post, said of Trump supporters: “His followers don’t fact-check anything—they’ll post everything, believe anything.” No wonder Trump’s supporters mostly believe his statements, according to polling. By contrast, another creator of fake news, in an interview with NPR, described how he “tried to write fake news for liberals—but they just never take the bait,” because they practice fact-checking and debunking.

This fact-checking and debunking illustrates that the situation, while dismal, is not hopeless. Such truth-oriented behaviors rely on our other thinking system, the intentional system or System 2, as shown by Chip and Dan Heath in their 2013 book Decisive: How to Make Better Choices in Life and Work. The intentional system is deliberate and reflective. It takes effort to use, but it can catch and override the thinking errors committed by System 1 so that we do not adopt the belief that something is true because we feel it is true, regardless of the evidence.

Many liberals associate positive emotions with empirical facts and reason, which is why their intentional system is triggered into doing fact-checking on news stories. Trump voters mostly do not have such positive emotions around the truth, and believe in Trump’s authenticity on a gut level regardless of the facts. This difference is not well recognized by the mainstream media, who treat their audience as rational thinkers and present information in a language that communicates well to liberals, but not to Trump voters.

To get more conservatives to turn on the intentional system when evaluating political discourse, we need to speak to their emotions and intuitions—the autopilot system, in other words. We have to get folks to associate positive emotions with the truth first and foremost, before anything else.

To do so, we should understand where these people are coming from and what they care about, validate their emotions and concerns, and only then show, using emotional language, the harm people suffer when they believe in lies. For instance, for those who care about safety and security, we can highlight how it’s important for them to defend themselves against being swindled into taking actions that make the world more dangerous. Those concerned with liberty and independence would be moved by emotional language targeted toward keeping themselves free from being used and manipulated. For those focused on family values, we may speak about trust being abused.

These are strong terms that have deep emotional resonance. Many may be uncomfortable using such emotional appeals. But we have to remember the end goal of helping people orient toward the truth. This is a case where the ends do justify the means. We need to be emotional to help people grow more rational—to make sure that while truth lost the battle, it will win the war.

A Shared Orientation Toward the Truth: The Pro-Truth Pledge

You can use these same strategies in your everyday conversations with people who let their ideological perspectives cloud their evaluation of reality. An excellent way to encourage a mutual orientation toward the truth and bridge the political divide is to get all participants in a conversation to take the Pro-Truth Pledge, a recent behavioral science instrument designed to reverse the tide of lies in our public sphere. I have been interviewed on both conservative and liberal shows whose hosts took the pledge, and it shaped our conversations in a highly productive manner, oriented toward an accurate evaluation of reality. Likewise, a number of pledge-takers have exhibited more truth-oriented behavior after signing.

For example, Michael Smith, a liberal candidate for Congress in Idaho who took the Pro-Truth Pledge, posted on his Facebook wall a screenshot of a tweet by Trump criticizing minority and disabled children. After being questioned on whether this was an actual tweet or a photoshopped one, the candidate searched Trump’s feed. He could not find the original tweet, and while Trump may have deleted it, the candidate edited his own Facebook post to say that “Due to a Truth Pledge I have taken I have to say I have not been able to verify this post.” He indicated that he would be more careful with future postings.

John Kirbow, a US Army veteran, member of the Special Operations community, and advocate for reason, also took the pledge. He then wrote a blog post about how it impacted him. He notes that “I’ve verbally or digitally passed on bad information numerous times, I am fairly sure, as a result of honest mistakes or lack of vigorous fact checking.” He describes how, after taking the pledge, he felt “an open commitment to a certain attitude” and, as a result, now tries to “think hard when I want to play an article or statistic which I’m not completely sold on.” Having taken it, he found that the Pro-Truth Pledge “really does seem to change one’s habits,” helping push him both to correct his own mistakes with an “attitude of humility and skepticism, and of honesty and moral sincerity,” and to encourage “friends and peers to do so as well.”

I hope these strategies, together with the Pro-Truth Pledge at ProTruthPledge.org, empower you to address alternative facts!

This article is an excerpt from the forthcoming The Truth-Seeker’s Handbook: A Science-Based Guide. To learn more about the book and be notified of its publication, click on this link.
