Facebook Shifts Content Moderation to Its Users. Are You Ready?
Meta would like to introduce its next fact checker—one who will identify lies, make compelling corrections, and warn others about misleading content.
It’s you.
Mark Zuckerberg, Meta’s chief executive, announced on Tuesday that he was ending most of the company’s moderation efforts, including third-party fact-checking and content restrictions. Instead, he said, the company will hand fact-checking responsibilities over to everyday users under a model called Community Notes, which was popularized by X and allows users to attach fact checks or corrections to social media posts.
The announcement signals the end of an era of content moderation and the adoption of looser rules that even Mr. Zuckerberg acknowledged would lead to more false and misleading content on the world’s largest social network.
“I think it’s going to be a spectacular failure,” said Alex Mahadevan, director of a media literacy program at the Poynter Institute called MediaWise, who has studied Community Notes on X. “The platform is now not responsible for anything that is said. They can shift the responsibility onto the users themselves.”
Such a turnaround would have been unimaginable after the 2016 or even 2020 presidential elections, when social media companies saw themselves as reluctant warriors on the front lines of the disinformation war. Widespread lies during the 2016 presidential election sparked public backlash and internal debate within social media companies over their role in spreading so-called fake news.
In response, the companies poured millions into content moderation efforts, paying third-party fact-checkers, building complex algorithms to limit toxic content and issuing a barrage of warning labels to slow the spread of lies – steps seen as necessary to restore public trust.
To some extent, those efforts worked: Researchers found that fact-checking labels were effective at reducing belief in lies, although they were less effective for conservative Americans. But the efforts also made the platforms — and Mr. Zuckerberg in particular — political targets of President-elect Donald Trump and his allies, who have said content moderation is nothing more than censorship.
Now the political situation has changed. As Mr. Trump moves to take control of the White House and the regulatory agencies that oversee Meta, Mr. Zuckerberg has moved to repair his relationship with Mr. Trump, dining at Mar-a-Lago, adding a Trump ally to Meta’s board of directors and donating $1 million to Mr. Trump’s inaugural fund.
“The recent election also feels like a cultural inflection point toward re-prioritizing speech,” Mr. Zuckerberg said in a video announcing the moderation changes.
Mr. Zuckerberg’s bet on Community Notes in place of professional fact-checkers was inspired by a similar experiment at X, where Elon Musk, its billionaire owner, outsourced the company’s fact-checking to users.
X now asks everyday users to spot lies and write corrections or add context to social media posts. The exact details of Meta’s program are not yet known, but on X, notes are initially visible only to users who have signed up for the Community Notes program. Once a note receives enough votes rating it as helpful, it is attached to the post for everyone to see.
“The dream for social networks is fully automated moderation that, one, they do not have to take responsibility for, and two, they do not have to pay anyone to do,” said Mr. Mahadevan, the director of MediaWise. “So Community Notes is the absolute dream of these people — they were essentially trying to develop a system that would automate fact-checking.”
Mr. Musk, another Trump ally, was an early supporter of Community Notes. He quickly expanded the program after laying off most of the company’s trust and safety team.
Research has shown that Community Notes can help dispel some viral misconceptions. Researchers have found that this approach works best for topics on which there is broad consensus, such as misinformation about Covid vaccines.
In that case, the notes “have become an innovative solution that provides accurate and reliable health information,” said John W. Ayers, associate chief of innovation in the Division of Infectious Diseases and Global Public Health at the University of California San Diego School of Medicine, who wrote a report on the topic in April.
But users with differing political views must agree on a fact check before it is publicly attached to a post, which means that misleading posts about politically divisive topics often go unchecked. MediaWise found that fewer than 10 percent of the Community Notes drafted by users ended up being published on offending posts. The numbers are even lower for sensitive topics like immigration and abortion.
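To make that agreement requirement concrete, here is a minimal, hypothetical sketch in Python of a bridging-style publication rule, in which a note is shown publicly only when raters from more than one viewpoint cluster independently find it helpful. The Rating class, the should_publish function and the thresholds are invented for illustration; X’s production system infers rater viewpoints with a matrix-factorization model rather than explicit labels, and Meta has not published the details of its version.

    # Hypothetical sketch of a "bridging" publication rule for community notes.
    # Assumption: each rater already has a viewpoint cluster label; X's real
    # system infers this with matrix factorization, and the thresholds here are invented.
    from dataclasses import dataclass

    @dataclass
    class Rating:
        rater_cluster: str   # e.g. "left" or "right", inferred from past rating behavior
        helpful: bool        # whether this rater marked the note as helpful

    def should_publish(ratings: list[Rating], min_per_cluster: int = 5,
                       min_helpful_share: float = 0.8) -> bool:
        """Show a note publicly only if every well-represented viewpoint
        cluster independently rated it helpful."""
        clusters: dict[str, list[bool]] = {}
        for r in ratings:
            clusters.setdefault(r.rater_cluster, []).append(r.helpful)

        # Only clusters with enough ratings count toward the decision.
        eligible = {c: v for c, v in clusters.items() if len(v) >= min_per_cluster}
        if len(eligible) < 2:          # require agreement across at least two clusters
            return False
        return all(sum(v) / len(v) >= min_helpful_share for v in eligible.values())

    # A note rated helpful by only one side stays hidden; broad agreement publishes it.
    one_sided = [Rating("left", True)] * 10 + [Rating("right", False)] * 6
    broad = [Rating("left", True)] * 9 + [Rating("right", True)] * 8 + [Rating("right", False)] * 2
    print(should_publish(one_sided))   # False
    print(should_publish(broad))       # True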
Researchers have found that posts on X receive most of their traffic within the first few hours of being published, but it can take days for a community note to be approved and shown to everyone.
Since its debut in 2021, the program has attracted interest from other platforms. YouTube announced last year that it was starting a pilot program allowing users to leave notes under misleading videos. The helpfulness of those fact checks is still being assessed by third-party evaluators, YouTube said in a blog post.
Meta’s existing content moderation tools had seemed overwhelmed by a torrent of lies and misleading content, but researchers found the measures to be quite effective. A study published last year in the journal Nature Human Behavior showed that warning labels, like those Facebook used to alert users to false information, reduced belief in falsehoods by 28 percent and reduced how often the content was shared by 25 percent. The researchers found that right-wing users were far more distrustful of fact checks, but the interventions were still effective at reducing their belief in false content.
“All the research shows that the more speed bumps, essentially, the more friction on a platform, the less bad information you have spreading,” said Claire Wardle, an assistant professor of communications at Cornell University.
Researchers believe community fact-checking is effective when combined with in-house content moderation efforts. But Meta’s hands-off approach could be risky.
“The community-based approach is one piece of the puzzle,” said Valerie Wirtschafter, a fellow at the Brookings Institution who has studied Community Notes. “But it can’t be the only solution, and it certainly can’t just be implemented as a custom, seamless solution.”