Meta, the social media giant led by Mark Zuckerberg, announced plans on Tuesday to discontinue its third-party fact-checking program across its platforms, including Facebook, Instagram, and Threads.
In its place, the company will implement Community Notes, a system similar to the one adopted by Elon Musk’s platform, X (formerly Twitter).
Joel Kaplan, Meta’s Chief Global Affairs Officer, shared the news in a blog post, stating that the Community Notes program will roll out first in the United States. Kaplan explained that while the original fact-checking initiative aimed to give users additional context from independent experts on viral content, it led to unintended consequences.
“Over time we ended up with too much content being fact-checked that people would understand to be legitimate political speech and debate,” Kaplan said. “Our system then attached real consequences in the form of intrusive labels and reduced distribution. A program intended to inform too often became a tool to censor.”
The shift to Community Notes, Kaplan suggested, would allow for a more balanced and community-driven approach. He highlighted the success of the system on X, where users collaborate to add context to potentially misleading posts. “We think this could be a better way of achieving our original intention of providing people with information about what they’re seeing – and one that’s less prone to bias,” he added.
Kaplan outlined that the Community Notes on Meta’s platforms will not be written or decided upon by Meta itself but by contributing users. To prevent bias, these notes will require consensus from individuals with diverse perspectives. Meta also plans to maintain transparency regarding how different viewpoints shape the Notes displayed in its apps.
The decision to move away from third-party fact-checking comes after Meta faced criticism for its stringent content moderation, which included the temporary suspension of former U.S. President Donald Trump following the Capitol riot on January 6, 2021. Trump had accused Meta of censoring conservative voices, a sentiment echoed by other critics of the platform’s policies.
In December 2024, Meta acknowledged that although it removes millions of pieces of content daily, this amounts to less than 1% of the content produced on its platforms, and that some of those removals may be mistakes.
With the new Community Notes program, Meta aims to foster a more user-driven, less biased approach to content moderation, allowing users to engage freely while providing essential context for the content they encounter.