West Virginia is suing Apple for allegedly allowing child abuse to spread on iCloud

West Virginia has filed a lawsuit against Apple, accusing the company of allowing the distribution and storage of child sexual abuse material (CSAM) on iCloud. In the lawsuit, filed Thursday, West Virginia Attorney General JB McCuskey alleges that by abandoning its CSAM detection system in favor of end-to-end encryption, Apple has made iCloud “a secure seamless pathway for the possession, protection and distribution [of] CSAM, in violation of state consumer protection laws.”

West Virginia claims that Apple “knowingly and intentionally designed its products with deliberate indifference to highly preventable harm.” McCuskey believes other states could also take legal action against Apple, telling reporters during a press conference that he thinks they will “see the leadership that this agency has taken” and “join us in this fight.”

The suit alleges that Apple sent 267 reports of child sexual abuse material to the National Center for Missing and Exploited Children, far fewer than Google’s 1.47 million reports and Meta’s more than 30.6 million. It also cites an internal memo between Apple executives in which Apple’s head of fraud, Eric Friedman, is said to have stated that iCloud is “the largest platform for the distribution of child pornography.”

Many online platforms, including Google, Reddit, Snap, Meta and others, use tools like Microsoft’s PhotoDNA or the Google Content Safety API to detect, remove and report child sexual abuse material in photos and videos submitted through their systems. Apple doesn’t currently use these tools, but it has rolled out some features aimed at keeping kids safe, including parental controls that require kids to get permission to text new numbers, as well as a tool that automatically blurs nude images for minors in iMessage and other apps.

“At Apple, protecting the safety and privacy of our users, especially children, is central to what we do,” Apple spokesman Peter Ajemian said in an emailed statement to The Verge. “We innovate every day to combat ever-evolving threats and maintain the safest and most trusted platform for kids.”

But McCuskey says Apple’s safeguards aren’t enough to protect children. “Apple has knowingly designed a set of tools that dramatically reduce friction in the possession, collection, protection, and dissemination of CSAM, while creating an encryption shield that makes it much more likely that bad actors will use Apple to protect their illegal activities,” the lawsuit alleges.

February 19 Update: Added Apple statement.
