# Editorial

Covid-19, vaccine hesitancy and the misinformation conundrum!

Sep 10, 2021, 11:34 AM

It’s hard to teach an algorithm to identify misinformation when humans themselves can’t agree on what misinformation is — and when political leaders can’t decide whether we should have more or less of whatever it entails.

Lately, vaccine hesitancy has been calcifying into outright vaccine refusal. That’s partly because so many people have been fed a steady diet of misinformation and conspiracy theories about vaccine risks.

Roughly 90 percent of Americans who don’t plan to get vaccinated say they fear possible side effects from the shot more than they fear covid-19 itself, a recent YouGov poll found. Roughly half of those who reject the vaccine believe the U.S. government is using the vaccine to microchip the population. (Hey, that would at least explain the global chip shortage.)

Where are people getting these kooky ideas? Politicians and pundits have been quick to blame social media platforms.

That’s understandable. Misinformation has flourished on Facebook and other sites for many years. Unlike truths, lies are unconstrained by reality, which means they can be crafted to be maximally interesting, sexy, terrifying.

In other words, they’re optimized to generate traffic, which happens to be good for tech companies’ bottom lines. “Fake news” — whether fashioned by enterprising Macedonian teenagers, malicious state actors, U.S. political groups, snake-oil salesmen or your standard-issue tinfoil-hatters — drove tons of engagement on these sites in the lead-up to the 2016 election and has continued to do so.

Whether out of principle or financial self-interest, tech executives initially said they weren’t in the business of taking down content simply because it was false. (This included, infamously, Holocaust-denial claims.) Intense blowback followed, along with pressure for tech companies to recognize how their tools were being exploited to undermine democracy, stoke violence and generally poison people’s brains; the firms have since ramped up fact-checking and content moderation.

During the pandemic, Facebook has removed “over 18 million instances of COVID-19 misinformation” and made less visible “more than 167 million pieces of COVID-19 content debunked by networks of fact-checking partners,” the company wrote in a blog post over the weekend.

This was in response to President Biden’s comments Friday that social media platforms were “killing people” by allowing vaccine misinformation to flourish.

On the one hand, yes, social media companies absolutely still can and must do more to scrub misinformation from their platforms. Case in point: A recent report on the “Disinformation Dozen” estimated that just 12 accounts are responsible for up to 65 percent of anti-vaccine content on Facebook and Twitter. Their claims include that vaccines have killed more people than covid and are a conspiracy to “wipe out” Black people. All 12 remain active on Facebook, Twitter or both.

But on the other hand: Actually doing more to stamp out this misinformation is challenging. The obstacle is not that these firms lack the workers or technology to identify problematic content; it is political.

Politicians of both parties hate Big Tech’s approach to content moderation and think it should change — but they would push it in diametrically opposite directions.

A Guest Editorial