West Virginia Attorney General JB McCuskey today announced a lawsuit against Apple, accusing the company of knowingly allowing iCloud to be used to distribute and store child sexual abuse material (CSAM). McCuskey says Apple has chosen to “do nothing about it” for years.
“Prioritizing the privacy of child predators is absolutely inexcusable. More importantly, it violates West Virginia law. Since Apple has so far refused to police itself and do the morally right thing, I am filing this lawsuit to demand that Apple follow the law, report these images, and stop re-victimizing children by allowing these images to be stored and shared,” said McCuskey.
According to the lawsuit (PDF), Apple's own executives have internally described the company as “the greatest platform for distributing child porn,” yet Apple reports far less CSAM than peers like Google and Meta.
In 2021, Apple announced new child safety features, including a system that would detect known CSAM in images stored in iCloud Photos. After backlash from customers, digital rights groups, child safety advocates, and security researchers, Apple abandoned its plans to detect CSAM in iCloud Photos.
“Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all,” Apple said when it announced that it would not be moving forward with the feature.
Apple later explained that building a tool to scan private iCloud data would “create new threat vectors for data thieves to find and exploit.”
West Virginia's attorney general argues that Apple shirked its responsibility to protect children under the guise of user privacy, and that Apple's decision not to deploy detection technology is an active choice rather than passive neutrality. Because Apple fully controls the hardware, software, and cloud infrastructure involved, the lawsuit contends, the company cannot claim to be an “unwitting, passive conduit of CSAM.”
The lawsuit seeks punitive damages and an injunction requiring Apple to implement effective measures to detect and report CSAM.
Apple was also sued in 2024 over its decision to abandon CSAM detection. That lawsuit, filed on behalf of a potential class of 2,680 victims, argues that Apple's failure to implement CSAM detection tools caused them lasting harm, and seeks $1.2 billion in damages.
Note: Due to the political or social nature of the discussion on this topic, the discussion thread is located in our Political News forum. All forum members and web visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.