Apple facing lawsuit over dropped child sexual abuse detection features

The company first set out a plan to crack down on crimes against children in 2021, but there were fears that the technology could be used as part of wider government surveillance.

Tech multinational Apple is at the centre of a lawsuit brought on behalf of an unnamed plaintiff, accusing the firm of dropping child sexual abuse material (CSAM) detection features and privacy-washing its obligations.

The 27-year-old plaintiff, who has filed under a pseudonym, stated that she was sexually abused as an infant by a relative who shared images of the abuse online, and that she now receives law enforcement notices almost daily saying that someone has been charged with possessing the content.

This is not the first suit of its kind to be brought against Apple. Earlier this year, a minor plaintiff referred to as Jane Doe filed with the US District Court for the Northern District of California, stating that she was coerced into making and uploading CSAM on an iPad, with images and videos stored on iCloud.

Her class-action suit claimed that Apple was hiding behind privacy concerns in order to abdicate its responsibility to prevent the storage of CSAM on iCloud.

Previously, in 2021, Apple attempted to introduce measures on iOS that would detect CSAM images, alert guardians to explicit content on a child’s device and notify law enforcement when CSAM was being stored on iCloud. However, the plan drew backlash from privacy advocates.

A month after first announcing the plan, Apple said it would postpone the introduction of the new features, amid concerns that the technology could be used to spy on users as part of widespread government surveillance.

In 2022, the company launched its Communication Safety in Messages feature, which alerts children to the dangers of sending or receiving content that includes nudity. The feature blurs the image, warns the child and offers information on how to reach out to trusted adults for help.

The current lawsuit states that as many as 2,680 other people may have been affected by Apple’s failure to deploy anti-CSAM protections, and it calls on the courts to compel Apple to compensate those harmed and change its practices.

SiliconRepublic.com has reached out to Apple for comment.
