Meta, the parent company of Facebook and Instagram, has announced a major overhaul of its content moderation strategy, replacing its third-party fact-checking program with a collaborative initiative called Community Notes. This new system, inspired by a similar feature on X (formerly Twitter), aims to provide users with contextual information on potentially misleading posts.
“Starting in the US, we are ending our third-party fact-checking program and moving to a Community Notes model,” the company said in a statement.
Mark Zuckerberg, Meta’s CEO, framed the change as a return to the company’s roots in free expression. “It’s time to get back to our roots around free expression on Facebook and Instagram. The current system has too many mistakes and too much censorship,” Zuckerberg said.
Joel Kaplan, Meta’s Chief Global Affairs Officer, added that the Community Notes initiative will empower users to collaboratively provide context on posts flagged as misleading. “This approach has proven successful on X, where users from diverse perspectives contribute to ensuring accuracy and transparency,” Kaplan said.
In addition to the moderation changes, Meta is revisiting restrictions on sensitive topics like immigration and gender, which Zuckerberg called “out of touch with mainstream discourse.” The company will also relocate its U.S.-based moderation teams to Texas to rebuild trust and streamline operations.
Zuckerberg also criticized what he characterized as censorship laws in Europe and other regions, saying Meta would work with the U.S. government to resist global pressure for stricter content restrictions.
The Community Notes system will roll out gradually in the US over the coming months, marking a significant shift in how Meta balances free speech and content integrity on its platforms.