Faye Flam: Facebook, YouTube erred in censoring

Faye Flam
Labeling misinformation online is doing more harm than good. The possibility that COVID-19 came from a lab accident is just the latest example. Social media companies tried to suppress any discussion of it for months. But why? There’s no strong evidence against it, and evidence for other theories is still inconclusive. Pathogens have escaped from labs many times, and people have died as a result.

Social media fact-checkers don’t have any special knowledge or ability to sort fact from misinformation. What they have is extraordinary power to shape what people believe. And stifling ideas can backfire if it leads people to believe there’s a “real story” that is being suppressed.

Misinformation is dangerous. It can keep people from getting lifesaving medical treatments, including vaccines. But flagging it doesn’t necessarily solve the problem. It’s much better to provide additional information than to censor information.

Part of the problem is that people think they know misinformation when they see it. And those most confident of their ability to spot it may be least aware of their own biases. That includes the fact-checking industry within the mainstream media, who were caught removing earlier posts on the lab leak theory, as well as social media “fact checkers” who aren’t accountable to the public.


Earlier this year, I interviewed physician and medical podcaster Roger Seheult, who said that he was censored by YouTube for discussing the clinical trials of hydroxychloroquine and ivermectin as potential COVID-19 treatments. No wonder so many people still believe these are the cures “they” don’t want you to know about. Much better would be an open discussion of the clinical trial process, which could help people understand why scientists think those drugs are unlikely to help.

Even without the power of censorship, social media culture encourages the facile labeling of ideas and people as a way of dismissing them — it’s easy to call people deniers or anti-science because they question prevailing wisdom.

Of course, there are ideas that are very unlikely to be true. These generally involve elaborate conspiracies or a complete overhaul of our understanding of the universe. Or, like cold fusion and the vaccine-autism theory, they’ve been tested and debunked multiple times by independent investigators.

I discussed the new interest in the lab leak with another science journalist who was interested in why so many reporters are still treating the natural spillover hypothesis as the only possibility. We agreed this isn’t like the connection between carbon emissions and climate change, where there’s a scientific consensus based on years of research and multiple, independently-derived lines of evidence. Here, even if a few scientists favored the natural spillover early on, the question is still open.

Last year, some scientists rightly objected that accusing any lab of causing a worldwide pandemic is a serious charge, and one that shouldn’t be made on the basis of proximity alone. That doesn’t mean we should ignore the possibility, or assume that some other equally unproven idea is right. In the face of an unknown, why would the fact-checkers deem one guess a form of misinformation, and another guess true?

And the lab leak idea got conflated in some people’s minds with conspiracy theories that the virus was deliberately created and released for population control or some other nefarious agenda. But a lab leak could have involved a perfectly natural virus that a scientist collected, or a virus that was altered in some well-intentioned attempt to understand it.

A New York Magazine piece by Nicholson Baker didn’t claim any smoking gun, but made a convincing case that the issue was still open. A Medium piece by former Times writer Nicholas Wade added little to what Baker said, but came at a time when the public was ready to reconsider. A recent Vanity Fair account details how the issue was suppressed inside the US government.

Maybe the intentions of the Facebook fact checkers were good. If there were a magical way to identify misinformation, then social media platforms could refrain from spreading it. But suppressing ideas they don’t like isn’t the way.

Faye Flam is a Bloomberg Opinion columnist.
