As anyone who pays attention to the utterances, tweeted and otherwise, of our President-elect knows, Donald Trump is, at best, a source of profoundly absurd declarations of nonsense, gleaned from sources that regularly write about a cabal of lizard people ruling the world, rampant voter fraud, a secret fluoride mind control program, and other blatant twaddle.
While the rational among us might reject such absurdities at first glance, these claims gain traction, eventually forming part of the core belief systems of millions.
The question, then, is why?
In the weeks since the U.S. election, concerns have been raised about the prominence and popularity of false news stories spread on platforms such as Facebook. A BuzzFeed analysis found that the top 20 false election stories generated more shares, likes, reactions and comments than the top 20 election stories from major news organizations in the months immediately preceding the election. For example, the fake article “Pope Francis Shocks World, Endorses Donald Trump for President, Releases Statement” was engaged with 960,000 times in the three months prior to the election.
Facebook has discounted the analysis, saying that these top stories are only a tiny fraction of the content people are exposed to on the site. In fact, Facebook CEO Mark Zuckerberg has said, “Personally I think the idea that fake news on Facebook, which is a very small amount of the content, influenced the election in any way – I think is a pretty crazy idea.” However, psychological science suggests that exposure to false news would have an impact on people’s opinions and beliefs. It may not have changed the outcome of the election, but false news stories almost certainly affected people’s opinions of the candidates.
Psychological research, including my own, shows that repeated exposure to false information can increase people’s belief that it is true. This phenomenon is called the “illusory truth effect.”
This effect happens to us all – including people who know the truth. Our research suggests that even people who knew Pope Francis made no presidential endorsement would be susceptible to believing a “Pope endorses Trump” headline when they had seen it multiple times.
Repetition leads to belief
People think that statements they have heard twice are more true than those they have encountered only once. That is, simply repeating false information makes it seem more true.
In a typical study, participants read a series of true statements (“French horn players get cash bonuses to stay in the U.S. Army”) and false ones (“Zachary Taylor was the first president to die in office”) and rate how interesting they find each sentence. Then, they are presented with a number of statements and asked to rate how true each one is. This second round includes both the statements from the first round and entirely new statements, both true and false. The outcome: Participants reliably rate the repeated statements as being more true than the new statements.
In a recent study, I and other researchers found that this effect is not limited to obscure or unknown statements, like those about French horn players and Zachary Taylor. Repetition can also bolster belief in statements that contradict participants’ prior knowledge.
For example, even among people who can identify the skirt that Scottish men wear as a kilt, the statement “A sari is the skirt that Scottish men wear” is rated as more true when it is read twice versus only once. On a six-point scale, the participants’ truth ratings increased by half a point when the known falsehoods were repeated. The statements were still rated as false, but participants were much less certain, rating them closer to “possibly false” than to “probably false.”
This means that having relevant prior knowledge does not protect people from the illusory truth effect. Repeated information feels more true, even if it goes against what you already know.
Even debunking could make things worse
Facebook is looking at ways to combat fake news on the site, but some of the proposed solutions are unlikely to fix the problem. According to a Facebook post by Zuckerberg, the site is considering labeling stories that have been flagged as false with a warning message. While this is a commonsense suggestion, and may help to reduce the sharing of false stories, psychological research suggests that it will do little to prevent people from believing that the articles are true.
People tend to remember false information, but forget that it was labeled as false. A 2011 study gave participants statements from sources described as either “reliable” or “unreliable.” Two weeks later, the participants were asked to rate the truth of several statements – the reliable and unreliable statements from before, and new statements as well. They tended to rate the repeated statements as more true, even if they were originally labeled as unreliable.