Workers Who Review Child Sexual Abuse Material Online Pay a Heavy Toll
Content moderators reviewing child sexual abuse material face PTSD, sleep disruption, and anxiety, yet protections on paper rarely match what workers actually receive.

Behind every image of child sexual abuse removed from a social media platform is a human reviewer who had to look at it first. That person is typically a contractor, often outsourced to a third-party firm, and frequently receives far fewer mental health protections in practice than the policies their employers publish would suggest.
Selena Scola filed a class-action lawsuit against Facebook in 2018, claiming she developed post-traumatic stress disorder after just nine months on the job reviewing graphic content. The suit alleged that the company "ignored its duty" to protect moderators who experienced mental trauma after viewing extreme material as part of their work. By 2020, the case had grown to represent more than 14,000 moderators. Facebook agreed to pay $52 million in a preliminary settlement that guaranteed each moderator at least $1,000, with damages of up to $50,000 available depending on diagnosis and documented injury.
The settlement was widely described as a landmark moment. It was followed almost immediately by a troubling reversal: with the ink still drying on that $52 million agreement, many outsourced moderators working through Accenture were told they would be required to view some of the most disturbing content on the internet for an extra 48 minutes per day.
The psychological effects of this work are well-documented, if still not fully understood. Research has found that moderators frequently experience intrusive thoughts about the child sexual abuse content they have encountered, avoidance of children, and negative cognitive and emotional effects including cynicism, anxiety, and detachment. Middlesex University's Secondary Trauma Research group notes that secondary traumatic stress is characterized by feelings of isolation, dissociation, and sleep disturbance in response to working with traumatic material. Poor mental health is identified as one of the major reasons moderators leave their positions; long-term exposure has triggered depression, anxiety, and in severe cases, addiction to drugs and alcohol.
The structural conditions of the job compound those risks. Most reviewers are not employees of the platforms whose content they clean. They are hired through contractors, which can mean limited access to employer-sponsored mental health resources, lower wages, and reduced job security. A global trade union alliance has called on major tech companies to adopt mental health protections throughout their supply chains for precisely this workforce.
The International Centre for Missing & Exploited Children released a Model Framework in 2024 specifically designed to safeguard the mental health of content moderators exposed to child sexual abuse material and promote trauma-informed employment practices. The framework spans the entire moderator employment cycle from hiring and retention through to post-employment, emphasizing comprehensive training, a supportive work environment, and a strong commitment to prioritizing mental health. Adoption, however, remains voluntary, and platforms are under no obligation to implement it.
The scale of the underlying problem only continues to grow. In 2025, Thorn's Safer platform alone detected more than 3,840,000 files of potential novel child sexual abuse material across the platforms it serves. Each flagged file represents a decision point that can still require human eyes. A qualitative study published in the journal Cyberpsychology found that moderators' ability to adjust to the work was associated with suppressing emotions, a coping mechanism researchers warned could have serious longer-term consequences for social and mental wellbeing.
The labor sustaining child safety online is hidden, outsourced, and underprotected. The platforms that profit from the content ecosystem these workers police have yet to demonstrate that the gap between written policy and lived experience has meaningfully closed.