West Virginia sues Apple over iCloud distribution of CSAM

West Virginia Attorney General JB McCuskey filed suit alleging iCloud enabled the spread of child sexual abuse material, seeking damages and court-ordered detection and product-design changes.

By Marcus Williams

West Virginia Attorney General JB McCuskey filed a consumer-protection lawsuit in Mason County Circuit Court on Thursday, Feb. 20, 2026, accusing Apple Inc. of allowing child sexual abuse material to be stored and distributed on iCloud and other iOS services and of failing to adopt effective detection, reporting and product-design safeguards.

The complaint alleges Apple prioritized user privacy and end-to-end encryption in ways that protected predators, creating a public nuisance and straining West Virginia’s child-protection and public-health systems. McCuskey framed the harm in stark terms: “These images are a permanent record of a child’s trauma, and that child is revictimized every time the material is shared or viewed. This conduct is despicable, and Apple’s inaction is inexcusable.” In a second statement he added, “Preserving the privacy of child predators is absolutely inexcusable. And more importantly, it violates West Virginia law. Since Apple has so far refused to police themselves and do the morally right thing, I am filing this lawsuit to demand Apple follow the law, report these images, and stop re-victimizing children by allowing these images to be stored and shared.”

The suit advances multiple causes of action, including strict liability for a design defect, negligence, creation of a public nuisance, and violations of the West Virginia Consumer Credit and Protection Act. The state seeks statutory and punitive damages as well as injunctive relief that would compel Apple to implement more effective detection and reporting tools, redesign product features to reduce abuse, and take reasonable measures to abate the alleged public nuisance.

Central to the complaint are internal communications and product decisions that prosecutors say show Apple knew its platforms were being misused. The filing cites an internal 2020 message from Apple’s head of fraud that described iCloud as “the greatest platform for distributing child porn” and said the company “chose not to know.” The suit also traces the company’s technical pivot: work on an image-screening system known as NeuralHash around 2021 was delayed amid privacy concerns and ultimately scrapped in December 2022 in favor of broader end-to-end encryption, which Apple markets as Advanced Data Protection.

Apple defended its approach in a prepared statement cited in filings and press accounts, saying it has implemented features that prevent children from uploading or receiving nude images and that it is "innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids." The company also points to Communication Safety, a set of parental-control features that Apple says "intervenes on kids' devices when nudity is detected in Messages, shared Photos, AirDrop and even live FaceTime calls."

AI-generated illustration

The West Virginia suit draws comparisons with other tech companies: prosecutors note that Google and Microsoft check uploaded photos and attachments against databases of identifiers of known child sexual abuse material maintained by the National Center for Missing and Exploited Children, a practice Apple has largely avoided. Advocacy findings cited in the complaint and in media reports say Apple has undercounted appearances of abuse material; police data reviewed by the U.K. group NSPCC indicated that predators in England and Wales used Apple services more often than the company's disclosures suggested.

Legal experts say the case could set a precedent by testing whether state consumer-protection and nuisance laws can force changes to platform architecture and encryption practices. For West Virginia, a successful injunction could require Apple to adopt scanning and reporting measures prosecutors say would speed removals and help law enforcement, while a defense victory would reinforce tech firms’ discretion over privacy-oriented design choices.

The lawsuit arrives amid parallel litigation: a group of victims filed a separate suit in 2024 seeking $1.2 billion in damages from Apple over similar harms. The Mason County complaint is now the latest test of how courts will weigh child safety, privacy and platform responsibility.
