The lawsuit alleges Apple has allowed iCloud to be a ‘secure avenue’ for storing and distributing CSAM.
West Virginia has filed a lawsuit against Apple, accusing the company of enabling the storage and distribution of child sexual abuse material (CSAM) in iCloud. In the suit, filed on Thursday, West Virginia Attorney General JB McCuskey claims that Apple’s decision to prioritize end-to-end encryption over a CSAM detection system has turned iCloud into a “secure frictionless avenue for the possession, protection, and distribution of CSAM,” violating state consumer protection laws.
Apple announced plans in 2021 for a system that would check iCloud photos against a database of known CSAM images, but after backlash from privacy advocates concerned about potential surveillance, the company abandoned the effort a year later. Apple software chief Craig Federighi told The Wall Street Journal at the time that the company was focusing instead on stopping child sexual abuse before it occurs.
West Virginia now claims Apple “knowingly and intentionally designed its products with deliberate indifference to avoidable harms.” McCuskey says he hopes other states will follow West Virginia’s lead and pursue their own legal action against Apple.
The lawsuit claims Apple reported just 267 CSAM incidents to the National Center for Missing & Exploited Children, compared to more than 1.47 million from Google and over 30.6 million from Meta. It also cites an internal message in which Apple fraud chief Eric Friedman reportedly described iCloud as the “greatest platform for distributing child porn.”
Many platforms, including Google, Reddit, Snap, and Meta, use Microsoft’s PhotoDNA or Google’s Content Safety API to detect, remove, and report CSAM. Apple uses neither, though it has introduced child safety features such as requiring children to get parental permission before texting new numbers and automatically blurring nude images sent to minors in iMessage. McCuskey contends these measures are insufficient.
The lawsuit argues that Apple has built tools that make it easier to possess, collect, protect, and distribute CSAM, while providing an encryption shield that bad actors can exploit.
