West Virginia Sues Apple, Says iCloud Is ‘Largest Child Porn Distribution Platform’

West Virginia’s attorney general alleges that iCloud is the largest platform ever created for the distribution of child porn, making the state the first government to sue Apple over the issue after a previous class-action lawsuit failed.

The attorney general’s office cited a 2020 text message in which Apple’s then-head of anti-fraud wrote that the company’s decisions about how it handles iCloud and image storage made it “the largest platform for the distribution of child porn.”

In a statement to Reuters, Attorney General JB McCuskey argued that Apple’s inaction was “inexcusable.”

“These images are a permanent record of a child’s trauma, and that child is re-victimized every time the material is shared or viewed,” McCuskey said in a statement.

His office is seeking statutory and punitive damages. The suit also asks a Mason County Circuit Court judge to compel Apple to take new measures to detect and report the offending material.

McCuskey’s office called the case the first of its kind brought by a government agency over the distribution of child sexual abuse material (CSAM) on Apple’s servers.

While Apple has previously denied any wrongdoing in similar lawsuits, it has yet to publicly respond this time.

It’s unclear what evidence West Virginia will provide, or how the state will prove financial damages or legal violations by Apple. Publicly hosted content is one thing in the eyes of the law, as the controversies over X and Grok-generated CSAM have shown.

Legal liability for non-public data repositories is less clear. Apple will likely invoke Section 230 of the Communications Decency Act to try to get the lawsuit dismissed, as it has in other cases. How Section 230 applies to private repositories that a hosting company can scan but whose content it does not create remains to be seen.

An abandoned Apple solution

Apple’s attempts to deal with CSAM sparked controversy in 2021, when the company announced plans for an automated detection system. The system would check hashed versions of files stored in iCloud against hashes of known CSAM.

Files designated as CSAM would then be reported to the National Center for Missing and Exploited Children. But the plans drew the ire of privacy advocates, who argued the system could be modified by governments to identify other kinds of material.
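To illustrate the general idea, here is a deliberately simplified Python sketch of hash-list matching. It is not Apple’s actual design: the company’s proposal relied on a perceptual hash (NeuralHash) and on-device cryptographic matching, whereas this sketch uses an ordinary SHA-256 digest and a hypothetical in-memory set of placeholder hashes.

```python
# Highly simplified illustration of hash-list matching, NOT Apple's design.
# Apple's 2021 proposal used a perceptual hash (NeuralHash) plus cryptographic
# threshold matching; here a plain SHA-256 digest and a Python set stand in.
import hashlib
from pathlib import Path

# Hypothetical database of known hashes (placeholder value only).
KNOWN_HASHES: set[str] = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def flag_matches(photo_dir: Path) -> list[Path]:
    """Return files whose digest appears in the known-hash set."""
    return [p for p in photo_dir.iterdir()
            if p.is_file() and file_digest(p) in KNOWN_HASHES]
```

A cryptographic digest only matches bit-identical files; the point of a perceptual hash like NeuralHash is to match images that have been resized or re-encoded, which is one reason privacy advocates worried about what else such a system could be tuned to find.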

Apple confirmed that it had abandoned its CSAM detection plans in December 2022. In August 2023, Erik Neuenschwander, Apple’s director of privacy and child safety, sought to explain the decision.

Neuenschwander argued that scanning each user’s iCloud data would open the door for data thieves and other threat actors. He added that the move would be a “slippery slope of unintended consequences”.

Confused attitude

Apple’s current approach includes a number of features it says help protect against CSAM distribution. Images containing sensitive material are automatically blurred on devices used by children, for example.

While welcome, features like this only work if the recipient doesn’t want to see such material. They do little to deter those who deliberately share CSAM through Apple’s servers.

So far, Apple has been tight-lipped about what further steps it is taking to prevent the spread of CSAM. The company reports the CSAM it discovers to the authorities, but again, little is known about how this works, and Apple is known to report far fewer cases than Google or Facebook.

Apple’s market position is built on prioritizing privacy, which has so far kept the company from taking the traditional – and publicly visible – steps to identify CSAM.

Google, Microsoft and other platform holders routinely scan photos and email attachments for CSAM identifiers.

Apple is likely to use Section 230 of the Communications Decency Act in its defense. The law offers some protection to internet companies, shielding them from liability for user-generated content provided they make a good-faith effort to moderate publicly facing material.

However, it is not clear whether a company can claim that it is not responsible for user content and at the same time report the same content to the authorities.
