How do you debunk a conspiracy theory?
Suppose people think that Israel carried out the 9/11 attacks or that the moon landing was faked. Or that Koch money or Hillary Clinton or Pete Buttigieg was behind the Iowa caucus fiasco, or that the coronavirus comes from a fiendish plot by multinational corporations.
Conspiracy theorists tend to be emotionally invested in their beliefs, meaning that if you contradict them, you might make them angry. And if you offer them evidence that they’re wrong, you might make them angrier still — and so strengthen their commitment to their belief.
Social scientists have found that, in some contexts, corrections actually backfire.
If, for example, people believe that the Affordable Care Act contains death panels, a correction can make them even more certain that it does.
One reason is that when people are told they’re wrong, they are immediately put on the defensive, and they work hard to defend their beliefs. Another reason is pure suspicion: Why would anyone bother to deny it, if it isn’t true?
Apart from their emotional commitment to their beliefs, conspiracy theorists usually know a great deal. They’ve studied the subject at hand. If the issue involves the assassination of John F. Kennedy, the moon landing, the attacks of 9/11 or the Democratic caucuses in Iowa, there’s a good chance that the conspiracy-minded will have a lot of (apparent) evidence to support their theories. It’s not easy to convince such people that they’re full of nonsense.
If you’re looking for guidance, some helpful hints come from recent research by Thomas Wood of Ohio State University and Ethan Porter of George Washington University. Studying the views of 10,100 people on 52 different issues, Wood and Porter find that correcting people’s false beliefs about the facts really can work — and for Democrats and Republicans alike — in certain cases.
For example, many Americans believe that the abortion rate is rapidly increasing; that undocumented immigrants commit crimes at higher rates than the general population; that teen pregnancy is on the rise; and that U.S. taxes are the highest in the world. All of these beliefs are false.
After survey participants read a factual correction on these questions, they tended to believe it. This is so even if their previous belief fit well with their political convictions — and even if the correction suggested that their preferred political party was spouting falsehoods.
But Wood and Porter were careful to use corrections from generally trusted institutions, lacking any obvious political affiliation: the Congressional Research Service, the Bureau of Labor Statistics, the Bureau of Justice Statistics, the Organization for Economic Cooperation and Development. People could not easily respond to the corrections by dismissing those who were responsible for them.
Though none of these corrections involved conspiracy theories, the central finding offers a large lesson: A conspiracy theorist is likely to care a lot about the source of an attempted refutation — perhaps even more than about its content.
If Exxon says it did not conspire to suppress information about the risks of climate change, it won’t be especially credible. But if environmental organizations (such as the Sierra Club and Greenpeace) and other oil-industry critics were to defend prominent companies against the specific accusations that underlie conspiracy theories, people might be moved. For those who want to refute such theories, it’s important to find “surprising debunkers” — people who are not merely trusted but who are also expected to be on the same side as those who embrace the theories.
A great deal of work suggests that conspiracy theorists feel fearful and threatened, and perceive a lack of control over their own lives. Finding patterns in seemingly random events helps to restore a sense of control, even or perhaps especially if the patterns seem to reveal some kind of conspiracy.
But if people are given a feeling of control or are affirmed, even momentarily, they might be willing to agree that the supposed patterns are illusory.
There is a corollary. Conspiracy theorists are often members of some tribe, with a shared set of political, ethical or religious convictions. Smart debunkers begin by suggesting that they are part of the same tribe, or that they like and respect its members and agree with them on important matters. A gesture like that can go a long way toward increasing receptivity to what debunkers have to say.
To be sure, there is a risk that some conspiracy theorists will only hear the conciliatory parts of what the debunkers are telling them, and ignore the rest. But an effort to show respect, and to find significant common ground, can open minds, even among those who seem strongly committed to outlandish beliefs.