In a dramatic shift in content moderation strategy, Meta CEO Mark Zuckerberg has announced that the company will eliminate its fact-checking program in favor of a community-driven approach similar to X’s Community Notes feature.
The move marks a significant departure from Meta’s post-2016 content moderation policies, which were implemented amid concerns about misinformation on social media platforms. Zuckerberg acknowledged that while these policies were created with good intentions, they ultimately led to “too many mistakes” and “too much censorship.”
The new system will allow users across Facebook, Instagram, and Threads to add context to potentially misleading posts, requiring consensus from people with diverse viewpoints before any notes become visible. Unlike the current system’s full-screen warnings, these community notes will appear as subtle labels indicating additional information is available.
Alongside this change, Meta is implementing several other significant policy shifts. These include:
- Moving trust and safety teams from California to Texas, a move Zuckerberg suggests will help build trust in areas “where there is less concern about the bias of our teams.”
- Relaxing restrictions on discussions about immigration, gender identity, and other politically sensitive topics.
- Reintroducing more political content into users’ feeds with a “more personalized approach.”
- Maintaining automated moderation only for severe violations like terrorism, child exploitation, fraud, and drug-related content.
Meta’s automated systems for predicting potential policy violations and demoting content will be largely dismantled, with the company now requiring community reports before taking action on less severe infractions.
In explaining the timing of these changes, Zuckerberg cited recent U.S. elections as a “cultural tipping point” toward prioritizing free expression. He also took aim at governments and legacy media, accusing them of pushing for increased censorship of online content.
The community notes feature will roll out first in the U.S. over the next few months, with Meta insisting that the system will require agreement between users with different perspectives to prevent biased ratings.