On Friday, President Biden called out Facebook over misinformation on its platform related to Covid-19. The remark came in response to a reporter who asked what his message was to platforms regarding Covid-19 misinformation. "They're killing people," Biden responded.

Biden later walked back those statements, clarifying that he had recently read that most of the misinformation on Facebook relating to Covid-19 vaccines originates from 12 accounts. “Facebook isn’t killing people, these 12 people are out there giving misinformation,” Biden said. “Anyone listening to it is getting hurt by it. It’s killing people. It’s bad information.”

After Biden's original comment, Facebook took exception and published a blog post asking the administration to stop pointing fingers at the social media giant. The post is literally titled "Moving Past the Finger Pointing." It reads, in part:

At a time when COVID-19 cases are rising in America, the Biden administration has chosen to blame a handful of American social media companies. While social media plays an important role in society, it is clear that we need a whole of society approach to end this pandemic. And facts–not allegations–should help inform that effort.

It's interesting that Facebook seems so sensitive to Biden's comments, given that the previous occupant of the White House regularly aimed incendiary remarks at social media. The company seemed to take a much friendlier approach with the Trump Administration.

Then again, it’s understandable why Facebook is so sensitive to the subject. No company wants to be labeled as responsible for causing people to die–especially not by the man who has arguably the largest megaphone in the world, the President of the United States. 

Plus, Facebook is under intense pressure from regulators and lawmakers over a range of issues, including whether it does too much or not enough about content moderation, depending on which side of the political spectrum you’re on. It’s also facing the possibility that the FTC will refile an amended version of its antitrust lawsuit against the company. 

Before we get too far, I think we should acknowledge that it gets messy whenever any platform begins to censor content based on what someone thinks is misinformation. It’s even messier when that someone is the federal government. Even if we can all agree that misinformation is bad (as a principle), there’s still a lot of room for interpretation as to what qualifies. 

Who decides what is factual information when the topic is nuanced and up for debate? That's not a hypothetical question, considering how many times the conventional wisdom has shifted during the pandemic. It can get tricky very quickly.

Facebook’s blog post, authored by Guy Rosen, Facebook’s VP of integrity, talks about what the social media platform is doing to mitigate the influence of misinformation, especially as it relates to vaccines:

Since the beginning of the pandemic we have removed over 18 million instances of COVID-19 misinformation. We have also labeled and reduced the visibility of more than 167 million pieces of COVID-19 content debunked by our network of fact-checking partners so fewer people see it and–when they do–they have the full context.

The post is actually a great example of Facebook's biggest problem, which, believe it or not, isn't misinformation. Facebook's biggest problem is that the way it perceives itself is far different from the way everyone else does.

For example, Facebook wants to be judged by the fact that it has taken action against the problem of content that could be misleading or harmful. Its critics judge the platform by the fact that it has so much of that content in the first place. 

If you have to remove 18 million instances of COVID-19 misinformation, that works out to roughly 40,000 a day over the last 16 months. Add the 167 million pieces of content the company has "reduced the visibility" of, roughly another 350,000 a day, and it's hard to argue that Facebook doesn't have a very real problem.
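For what it's worth, here's a quick back-of-the-envelope check of those per-day figures (a minimal sketch, assuming "the last 16 months" comes out to roughly 486 days):

```python
# Quick sanity check of the per-day figures cited above.
# Assumption: "the last 16 months" is roughly 486 days (16 x ~30.4 days).
DAYS = 16 * 30.4  # approximately 486 days

removed_total = 18_000_000    # posts removed outright
reduced_total = 167_000_000   # posts labeled and reduced in visibility

print(f"Removed per day: ~{removed_total / DAYS:,.0f}")  # ~37,000, i.e. roughly 40,000
print(f"Reduced per day: ~{reduced_total / DAYS:,.0f}")  # ~343,000, i.e. roughly 350,000
```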

That's not to say Facebook isn't working hard to do something about it, but an enormous amount of the content being shared spreads conspiracy theories and fake news. Policing it requires thousands of content moderators, and they still can't catch it all. That's the type of very big problem that comes with running a platform of almost 3 billion users.

Here’s the point–perception matters. It’s the old adage that we judge ourselves by our best intentions. Facebook sees itself not as the world does–as a place where your personal information is collected so you can be targeted by ads, and where inflammatory content is amplified and shared, causing potentially real-world harm–but as a noble cause that connects people across the globe. 

Facebook certainly isn't the only business facing this challenge of a disconnect in perception. It is, however, a perfect example of what can go wrong when a company loses sight of how big that disconnect really is. That's not about pointing fingers. It's about accepting reality and figuring out how to better solve the problem.
