Facebook Shifts Content Moderation to Its Users. Are You Ready?

Published on Jan 7, 2025, 7:14 PM

Meta Platforms Inc., led by CEO Mark Zuckerberg, is eliminating professional fact-checkers and will instead rely on users to moderate content on its platforms. The move, modeled on the user-notes system on X (formerly Twitter), could lead to a rise in misinformation; Meta itself acknowledges that the shift may reduce the quality of information shared on its platforms.

Stock Forecasts

META

Negative

The shift to user-based content moderation could allow misinformation to spread more easily, which may erode user trust and engagement. The reduction in professional oversight could also invite regulatory scrutiny, weighing on investor sentiment.

Related News

While Republicans praised the move, some groups cautioned that effectively ending Meta’s fact-checking program would also lead to more conspiracy theories.

The social networking giant will stop using third-party fact-checkers on Facebook, Threads and Instagram and instead rely on users to add notes to posts. The move is likely to please President-elect Trump and his allies.

Fact-checking groups that worked with Meta said they had no role in deciding what the company did with the content that was fact-checked.