Why Meta fired its fact-checkers and what you can do about it
Meta CEO Mark Zuckerberg recently announced that the company would fire its fact-checkers and instead rely on its users, with the help of AI, to police Facebook and Instagram for false or misleading posts. The company will also move its content moderation team from California to Texas, lift restrictions designed to protect immigrants and LGBTQ people from hate speech and "dial back" penalties for rule-breakers.
"What started as a move to be more inclusive has increasingly been used to shut down opinions," Zuckerberg posted Tuesday. "It has gone too far."
The announcement has drawn praise from those who see it as a win for free speech and ire from critics who worry it will unleash a torrent of misinformation and hateful slurs.
Nathan Schneider, assistant professor of media studies, sees it as a wake-up call to users that it may be time to take back some control over social media platforms.
"This should, above all, be a reminder that we have entrusted a few companies that are way too powerful with the stewardship of our public discourse. And this is not an acceptable arrangement," he said.
CU Boulder Today sat down with Schneider to get his take on what happened and what could happen next.
What just happened?
We're in a full-circle moment right now. In the wake of the first Trump election in 2016, lots of people were criticizing Facebook, now Meta, for spreading misinformation. So the company invested heavily in a complicated, bureaucratic operation to deal with it. Now, we are seeing a profound bow to political pressure to reverse those steps. At the same time, there's also a piece of an old Silicon Valley dream alive in this decision: the idea that we don't need to pay people and third-party organizations to check facts, that we can do it with technological systems and crowdsourcing.
What do you make of this?
On its surface, this idea of crowdsourcing fact-checking has a lot of appeal to it. It seems to be giving power back to the users. The basic idea that one company with one rule book should control the speech of the world has always been absurd. But the question is: Is the company really going to give up that power? I think the answer is almost certainly 'no'. I worry that what's happening at Meta is something very similar to what we've seen at X: Under the guise of free speech and community control, we will see a consolidation of power and a relinquishment of responsibility.
Meta is moving to a "community notes" system, similar to X's. Doesn't that give more control to the user?
Community Notes on X does allow users to add fact-checks to a post. But whether they actually appear and are prioritized depends on an algorithm that the company controls. Already, on X, we see a strong leaning toward partisan and often right-wing voices. It's replacing a bureaucracy for regulating speech with an algorithm that does it. And algorithms can make the consolidation of power easier.
Will we see more misinformation?
If we look at the experience with Community Notes on X, I think it's very likely that this move will lead to more widespread misinformation and disinformation. X has made its network very difficult for researchers to study. But so far, external researchers have found that the majority of notes on misinformation are never shown to users and that the system has not reduced engagement with misleading posts. If that is the model that Meta is promising to follow, this is a troubling development.
Will we see more slurs against vulnerable populations?
Rightly, many people are terrified. The company has been quite explicit that it is committed to tolerating and normalizing the discourse of the far right, which includes denying the dignity of people in many communities, particularly queer folks. This move reflects the kind of naked power that this company has always been able to exert over speech, and its ability to determine what the bounds of acceptable speech in society are. As the political winds shift, the company appears to be embracing that shift across its networks. We should be asking ourselves whether we can continue to place so much trust in a company that can abruptly remove protections in this way.
What opportunities do you see?
What's hopeful to me is a growing movement of people embracing other kinds of social networks like Mastodon and Bluesky. These networks rely less on any single company and more on open protocols, just like the web itself. This means that users can have more control over their data and develop their own interfaces and algorithms. None of these technologies is a panacea, but they open the door for us to shift the question away from what we should hope Mark Zuckerberg will do and toward what we could do ourselves to change our online lives for the better.
What can people do now?
Social media is not something anyone can change on their own. I think it's worth starting conversations with the people and organizations that you actually want to connect with and asking: Hey, where could we go that we would feel really at home in? And what kinds of steps can we take together to get there? Ultimately, these choices are social, and any moves we make to better spaces will have to happen collectively.
What would these new social networks look like?
This year, my lab is running what we call the Open Social Incubator. We are working with 10 communities from around the world to explore what it takes to move to emerging networks that are more decentralized and democratically governed. In many respects, the future of a healthier social media could look more like the places we've relied upon for information in the past. You could look at an example like Wikipedia, a widely trusted utility on the internet that is mission-driven and organized as a nonprofit. Or maybe your local library could anchor a space on the internet for social conversations. Back in 2017, I co-founded Social.coop, a cooperatively governed Mastodon server where the users can make their own decisions about moderation policies. Although it took a lot of time to develop, I now reap the rewards in having a really healthy, pleasurable home for my online life. I hope that in the future more people will have that experience.
CU Boulder Today regularly publishes Q&As with our faculty members weighing in on news topics through the lens of their scholarly expertise and research/creative work. The responses here reflect the knowledge and interpretations of the expert and should not be considered the university position on the issue. All publication content is subject to edits for clarity, brevity and university style guidelines.