West Virginia’s attorney general said on Thursday that Apple knowingly allowed its iCloud service to be used to store and share child sexual abuse material, in what is believed to be the first lawsuit filed by a state or federal agency against the company over the issue.
The attorney general, John B. McCuskey, said that by declining to use certain tools that recognize photos and videos containing child sexual abuse material, Apple aided the spread of the material and violated the state’s consumer protection law.
“These images are a permanent record of a child’s trauma, and that child is revictimized every time the material is shared or viewed,” Mr. McCuskey said in a statement. “This conduct is despicable, and Apple’s inaction is inexcusable.”
West Virginia’s lawsuit spotlights concerns that the privacy protections of Apple’s iCloud service allow illegal material to be stored and shared more easily than on social media apps like Facebook and Instagram. The suit seeks monetary damages and changes to Apple’s practices for detecting child sexual abuse material and designing safe products.
“At Apple, protecting the safety and privacy of our users, especially children, is central to what we do,” Apple said in a statement. “We are innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids.” The company added that it offered parental controls, including a feature that warns children if they receive or try to send content with nudity.
Apple has been sued before by victims over its failure to limit child sexual abuse material on iCloud. In December 2024, a group of victims sought more than $1.2 billion in damages from the company, which has filed a motion to dismiss that is still pending. Another lawsuit, involving a 9-year-old girl in North Carolina, was filed in August 2024 but largely dismissed last year.
“We need somebody to hold them accountable and make them answer the question of whether or not they believe this belongs on this platform,” said Sarah Gardner, the founder of Heat Initiative, a child safety group that helped bring the lawsuit in December 2024.
For years, Apple’s reports of child sexual abuse material have been a tiny fraction of the totals reported by its peers. In 2023, for example, Meta made more than 30 million reports of suspected sexual abuse material to the National Center for Missing & Exploited Children, a federal clearinghouse. Google made more than one million reports. Apple made 267.
West Virginia’s lawsuit is the latest example of a novel legal strategy against big technology companies. Section 230 of the Communications Decency Act of 1996 shields companies from legal liability for what their users post. But recent lawsuits have argued that technology companies are subject to personal injury liability for creating defective products that harm users. Meta and YouTube are currently on trial in California state court, accused of encouraging social media addiction.
In 2009, Microsoft helped create a tool for recognizing photos and videos containing child sexual abuse material, called PhotoDNA, which companies including Google and Dropbox then adopted. Apple declined to use PhotoDNA.
In 2019, an investigation by The New York Times found that technology companies failed to police abusive material on their platforms. Reporting by The Times led Eric Friedman, an executive responsible for fraud protection at Apple, to tell a colleague that he thought the company was underreporting child sexual abuse material.
“We are the greatest platform for distributing child porn,” Mr. Friedman said in a text message in 2020. He said that was because the company had prioritized privacy over safety.
In 2021, Apple introduced its own system for detecting child sexual abuse material, using a technique called NeuralHash. But cybersecurity experts warned that the system could endanger user privacy and give governments access to iPhones. Shortly afterward, Apple dropped its plans for the system.
Kalley Huang is a Times reporter in San Francisco, covering Apple and the technology industry.