A Better Way to Think About Conspiracies

People will always be interested in conspiracy theories. They need a tool kit for discriminating among different fringe ideas.

No problem concerns journalists and press-watchers so much these days as the proliferation of conspiracy theories and misinformation on the internet. “We never confronted this level of conspiracy thinking in the U.S. previously,” Marty Baron, the former executive editor of The Washington Post, told Der Spiegel in a recent interview. His assumption, widely shared in our profession, is that the internet has forged an age of false belief, encouraged by social media companies and exploited by Donald Trump, that requires new thinking about how to win the battle for the truth.

Some of that new thinking leads to surprising places. For instance, my colleague Kevin Roose recently reported that some experts wish that the Biden administration would appoint a “reality czar” — a dystopian-sounding title, he acknowledged, for an official charged with coordinating anti-disinformation efforts — as “the tip of the spear for the federal government’s response to the reality crisis.”

Meanwhile, my fellow Opinion writer Charlie Warzel recently explored the work of the digital literacy expert Michael Caulfield, who argues that the usually laudable impulse toward critical thinking and investigation is actually the thing that most often leads online information-seekers astray. Instead of always going deeper, following arguments wherever they seem to lead, he suggests that internet users be taught to simplify: to check arguments quickly against mainstream sources, determine whether a given arguer is a plausible authority, and then move on if the person isn’t.

I’m pretty doubtful of the “reality czar” concept, but Caulfield’s arguments were more interesting. We should be skeptical that the scale of conspiracy thinking today is a true historical novelty; the conspiracy theories of the Revolutionary era, for instance, would be entirely at home on today’s internet. But we’re clearly dealing with a new way in which people absorb and spread conspiracies, and a mind-altering technology like the internet probably does require a new kind of education, to help keep people from losing their senses in the online wilds or settling in as citizens of partisan dreamscapes.

But that education won’t be effective if it tells an overly simplistic story, in which all consensus claims are true and all conspiracy theories empty. In reality, a consensus can be wrong, and a conspiracy theory can sometimes point toward an overlooked or hidden truth — and the approach that Caulfield proposes, to say nothing of the idea of a centralized Office of Reality, seems likely to founder on these rocks. If you tell people not to listen to some prominent crank because that person doesn’t represent the establishment view or the consensus position, you’re setting yourself up to be written off as a dupe or deceiver whenever the consensus position fails or falls apart.

I could multiply examples of how this falling apart happens — I am old enough to remember, for instance, when only cranks doubted that Saddam Hussein had weapons of mass destruction — but the last year has given us a thuddingly obvious case study in Covid-19: In January and February of 2020, using a follow-the-consensus method of online reading could have given you a wildly misleading picture of the virus’s risks, how it was transmitted, whether to wear masks and more.

Is there an alternative to leaning so heavily on the organs of consensus? I think there might be. It would start by taking conspiracy thinking a little more seriously — recognizing not only that it’s ineradicable, but also that it’s a reasonable response to both elite failures and the fact that conspiracies and cover-ups often do exist.

If you assume that people will always believe in conspiracies, and that sometimes they should, you can try to give them a tool kit for discriminating among different fringe ideas, so that when they venture into outside-the-consensus territory, they become more reasonable and discerning in the ideas they follow and bring back.

Here are a few ideas that belong in that kind of tool kit.

Consider two theories about Covid-19: the conceit that it was designed by the Gates Foundation for some sort of world-domination scheme, and the theory that it was accidentally released by a Chinese virology lab in Wuhan, a disaster that the Beijing government then sought to cover up. If you just follow the official media consensus, you’ll see both these theories labeled misinformation and conspiracy. But in fact the two are wildly different, and the latter is vastly more plausible than the former — so plausible that it might even be true.

What makes it plausible is that it doesn’t depend on some complex plot for a one-world government; it just depends on the human and bureaucratic capacity for error and the authoritarian tendency toward cover-up. And this points to an excellent rule for anyone who looks at an official narrative and thinks that something seems suspicious: In following your suspicions, never leap to a malignant conspiracy to explain something that can be explained by incompetence and self-protection first.

After the November election, I spent a fair amount of time arguing with conservatives who were convinced that it had been stolen for Joe Biden, and after a while I noticed that I was often playing Whac-a-Mole: They would raise a fishy-seeming piece of evidence, I would show them something debunking it, and then they would just move on to a different piece of evidence that assumed a different kind of conspiracy — shifting from stuffed ballot boxes in urban districts to computer shenanigans in suburban districts, say — without losing an iota of their certainty.

That kind of shift doesn’t prove the new example false, but it should make you suspect that what’s happening is a search for facts to fit a predetermined narrative, rather than just the observation of a suspicious fact with an open mind about where it leads. If you’re reading someone who can’t seem to internalize the implications of having an argument proved wrong, or who constantly cites easily discredited examples, you’re not being discerning; you’ve either wandered into someone’s ideological fixation or you’re a mark for intentional fake news.

For example: If you tell me that the C.I.A. killed John F. Kennedy, I will be dismissive, because the boring official narrative of his assassination — hawkish president killed by a Marxist loner who previously tried to assassinate a right-wing general — fits the facts perfectly well on its own. But if you tell me that some mysterious foreign intelligence agency was involved in Jeffrey Epstein’s strange career, I will be more open to your theories, because so much about Epstein’s dizzying ascent from prep school math teacher to procurer to the famous and the rich remains mystifying even now.

Likewise, every fringe theory about U.F.O.s — that they’re some kind of secret military supertechnology, that they’re really aliens, that they’re something stranger still — became a lot more plausible in the last couple of years, because the footage released by Pentagon sources created a mystery that no official or consensus narrative has adequately explained.

This kind of slippage, from entertaining one fringe idea to embracing them all, is clearly a feature of conspiratorial thinking: Joining an out-group that holds one specific outlandish opinion seems to encourage a sense that every out-group must be on to something, that every outlandish opinion must be right. Thus the person who starts out believing that Epstein didn’t kill himself ends up going full QAnon. Or the person who decides that the Centers for Disease Control is wrong about their chronic illness ends up refusing chemotherapy for cancer.

But at the same time, there is no intellectually necessary reason why believing in one piece of secret knowledge, one specific conspiracy theory, should require a general belief in every fringe idea.

Here revealed religion offers a useful model. To be a devout Christian or a believing Jew or Muslim is to be a bit like a conspiracy theorist, in the sense that you believe that there is an invisible reality that secular knowledge can’t recognize and a set of decisive events in history that fall outside of nature’s laws.

But the great religions are also full of warnings against false prophets and fraudulent revelations. My own faith, Roman Catholicism, is both drenched in the supernatural and extremely scrupulous about the miracles and seers that it validates. And it allows its flock to remain simply agnostic about a range of possibly supernatural claims and phenomena, to accept that they might be real, or might not, without making them the basis of their faith.

Some version of that careful agnosticism, that mixture of openness and caution, seems like a better spirit with which to approach the internet and all its rabbit holes than either a naïve credulity or a brittle confidence in mainstream media consensus. And I suspect that’s actually what a lot of polling on conspiracy theories traditionally captures: not a blazing certainty about what really happened on 9/11 or who killed Kennedy or how “they” faked the moon landing, but a kind of studied uncertainty about our strange world and its secrets.

What we should hope for, reasonably, is not a world where a “reality czar” steers everyone toward perfect consensus about the facts, but a world where a conspiracy-curious uncertainty persists as uncertainty, without hardening into the zeal that drove election truthers to storm the Capitol.

It’s that task that our would-be educators should be taking up: not a rigid defense of conventional wisdom, but the cultivation of a consensus supple enough to accommodate the doubter, instead of making people feel as if their only options are submission or revolt.

Ross Douthat has been an Opinion columnist for The Times since 2009. He is the author of several books, most recently “The Decadent Society.”
