'I was moderating hundreds of horrific and traumatising videos'

Published on Nov 10, 2024, 7:40 PM

The article examines the challenging and often traumatizing work of social media moderators, who review and remove disturbing online content. Despite advances in AI, human moderators remain essential because of the complex nature of the material they must evaluate. The article highlights the mental health struggles these moderators face and the ongoing legal actions seeking better support and compensation for them. As AI's role in moderation grows, concerns remain about its effectiveness and about the potential displacement of human moderators.

Stock Forecasts

Given the increasing scrutiny of social media platforms over user safety, and the potential legal liabilities tied to content moderation, companies are expected to either invest more in effective moderation resources, both human and AI, or face stricter regulation. This creates an opportunity for tech companies focused on AI moderation tools and on mental health support for moderators.

Social media giants such as Meta (Facebook/Instagram) and TikTok may face rising legal and operational costs from ongoing lawsuits and the need to better support moderators. These costs could squeeze profit margins, especially as regulatory scrutiny intensifies, which may weigh on short-term investor sentiment toward these companies.

Related News

Researchers have found that technology and loneliness are interlinked, with loneliness stoked by the ways we interact with social media, text messaging and binge-watching.

META
PTON

The government says it wants to mitigate the "harm" social media is inflicting on children.

META
QRVO

The shift in policy, covering government agencies and contractors working on national security, is intended to promote “responsible and ethical” innovations, the company said.