Meta Shifts to Community Based Content

News Desk

Islamabad: On Tuesday, social media giant Meta announced the termination of its U.S. fact-checking program, opting for a community-based system similar to X’s “Community Notes.” The company also revealed plans to relax restrictions on discussions around contentious topics like immigration and gender identity.

This decision marks a significant policy shift for Meta, reversing its long-standing emphasis on active content moderation. CEO Mark Zuckerberg, who had previously championed stringent content controls on the platform, has faced criticism from conservatives over allegations of censorship.

In a video statement, Zuckerberg emphasized the need for more free expression on Meta’s platforms. “We’ve reached a point where it’s just too many mistakes and too much censorship. It’s time to get back to our roots around free expression,” he said. “We’re going to focus on reducing mistakes, simplifying our policies, and restoring free expression. We’ll tune our content filters to require much higher confidence before taking down content.”

The end of Meta’s fact-checking program, which was launched in 2016, caught several partner organizations by surprise. “We didn’t know this was happening, and it comes as a shock to us,” said Jesse Stiller, managing editor at Check Your Fact. Other partners, such as Reuters, AFP, and USA Today, did not immediately comment.
Meta’s independent Oversight Board, however, welcomed the change.

The policy revisions will impact Facebook, Instagram, and Threads, three of Meta’s most popular platforms, which collectively serve over 3 billion users globally. Zuckerberg has recently expressed regret over certain moderation decisions, particularly concerning COVID-19 content. Additionally, Meta made headlines by donating $1 million to former President Donald Trump’s inaugural fund, breaking with its previous stance on political donations.

Critics of the move voiced concerns about the risks of increased misinformation. Ross Burley, co-founder of the nonprofit Centre for Information Resilience, called it “a major step back for content moderation at a time when disinformation and harmful content are evolving faster than ever.” He added that the shift seemed more focused on political appeasement than sound policy.

Community Notes Model

The success of Meta’s new approach remains to be seen. Elon Musk’s platform, X, is currently under investigation by the European Commission for its handling of illegal content and information manipulation, including its “Community Notes” feature, which Meta is now adopting.

Meta plans to roll out Community Notes in the U.S. over the next couple of months and improve the model throughout 2025. This feature will allow users to flag posts that may be misleading or lack context, rather than relying on independent fact-checking organizations to manage this responsibility.

Unlike previous programs, Meta will not directly decide which Community Notes appear on posts. Instead, users will play a central role in determining the notes’ visibility. “It’s a smart move by Zuck and something I expect other platforms will follow,” said X CEO Linda Yaccarino in a post.

Additionally, Meta will shift its trust and safety teams, responsible for overseeing content policies, from California to Texas and other U.S. locations. The company said its automated systems would focus on removing illegal and high-severity content, such as terrorism and drugs.

As Meta overhauls its content moderation strategy, how these changes will reshape its platforms’ role in online discourse remains an open question.