Apple Is Delaying Plans to Scan Phones for Child Abuse Material

Apple has announced that it’s officially delaying the rollout of its controversial plans to scan iPhones for child sexual abuse material (CSAM), The Verge reports.

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” Apple wrote in a statement.

It’s a notable — and rare — moment of self-reflection. Apple is listening to its critics.

“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the statement continues.


Last month, Apple announced that it planned to scan images uploaded to its iCloud Photos service and compare their hashes, which are unreadable digital representations of each image, to an existing database of known CSAM hashes. If it detected positive matches, the company would report them to the National Center for Missing and Exploited Children (NCMEC).
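
To illustrate the general idea of hash matching (and not Apple's actual system, which relies on a perceptual hash called NeuralHash plus on-device cryptographic protocols), here is a minimal Python sketch. The database contents, function names, and the use of SHA-256 as a stand-in hash are all assumptions for illustration only:

```python
import hashlib

# Hypothetical database of known-material hashes (stand-in values for illustration).
KNOWN_HASHES = {
    "a948904f2f0f479b8f8197694b30184b0d2ed1c1cd2a1ec0fb85d299a192a447",
}

def image_hash(image_bytes: bytes) -> str:
    """Return a hex digest for an image.

    Real systems use a perceptual hash that tolerates resizing and
    re-encoding; SHA-256 here is only a placeholder.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_material(image_bytes: bytes) -> bool:
    """Check whether an image's hash appears in the known-hash database."""
    return image_hash(image_bytes) in KNOWN_HASHES
```

In practice, a system like the one Apple described would not expose the hash database or the match result directly; matching would happen under cryptographic protections so that neither side learns more than necessary. This sketch only shows the basic lookup concept.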

Apple’s announcement also included a new feature that would automatically blur sexually explicit photos sent to or received by a child and alert their parents.

The announcement raised red flags among online privacy advocates, many of whom argued that the move could set a troubling precedent for Apple to scan for other material in the future.

That would be an especially ominous development in countries where the government keeps a tight grip on what can be shared online, such as India and China.


The Electronic Frontier Foundation argued in a statement at the time that “it’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.”

For now, Apple is putting its plans to scan iPhones for CSAM on ice. It’s unclear what the tech giant’s future child safety features will end up looking like — but whatever they are, they’ll likely face similar scrutiny by privacy advocates.

READ MORE: Apple delays controversial child protection features after privacy outcry [The Verge]

More on Apple’s announcement: Apple Will Scan Every iPhone for Images of Child Sexual Abuse

