On Friday, the company said it will pause testing the tool in order to gather more feedback and make improvements.
The plan centers on a new system that will, if it is eventually launched, check iOS devices and iCloud photos for child abuse imagery. It includes a new opt-in feature that would warn minors and their parents of sexually explicit incoming or sent image attachments in iMessage and blur them.
Apple’s announcement last month that it would begin testing the tool fit with a recent increased focus on protecting children among tech companies — but it was light on specific details and was swiftly met with outraged tweets, critical headlines and calls for more information. So on Friday, Apple said it would put the brakes on implementing the features.
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” the company said. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
In a series of press calls aiming to explain the planned tool last month, Apple stressed that consumers’ privacy would be protected because the tool would turn photos on iPhones and iPads into unreadable hashes, or complex numbers, stored on user devices. Those numbers would be matched against a database of hashes provided by the National Center for Missing and Exploited Children (NCMEC) once the pictures were uploaded to Apple’s iCloud storage service. (Apple later said other organizations would be involved in addition to NCMEC.)
Only after a certain number of hashes matched the NCMEC’s photos would Apple’s review team be alerted, so that it could decrypt the information, disable the user’s account and alert NCMEC, which could inform law enforcement about the existence of potentially abusive images.
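The threshold-based matching described above can be sketched in a few lines of code. This is an illustrative simplification only: the function names and the threshold value are assumptions, and Apple’s actual system relied on perceptual hashing and cryptographic techniques far more involved than a simple set lookup.

```python
# Hypothetical sketch of threshold-based hash matching.
# All names and values here are illustrative assumptions, not Apple's implementation.

REVIEW_THRESHOLD = 30  # hypothetical number of matches required before human review


def count_matches(uploaded_hashes, known_csam_hashes):
    """Count how many uploaded photo hashes appear in the known database."""
    return sum(1 for h in uploaded_hashes if h in known_csam_hashes)


def should_alert_review_team(uploaded_hashes, known_csam_hashes,
                             threshold=REVIEW_THRESHOLD):
    """A human review is triggered only once the match count reaches the threshold."""
    return count_matches(uploaded_hashes, known_csam_hashes) >= threshold
```

The key design point reflected here is that no single match triggers any action; only an accumulation of matches past a preset threshold would escalate an account for review.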
Many child safety and security experts praised the intent of the plan, recognizing the ethical responsibilities and obligations a company has over the products and services it creates. But they also said the efforts presented potential privacy concerns.
“When people hear that Apple is ‘searching’ for child sexual abuse materials (CSAM) on end user phones they immediately jump to thoughts of Big Brother and ‘1984,’” Ryan O’Leary, research manager of privacy and legal technology at market research firm IDC, told CNN Business last month. “This is a very nuanced issue and one that on its face can seem quite scary or intrusive.”
Critics of the plan applauded Apple’s decision to pause the test.
Digital rights group Fight for the Future called the tool a threat to “privacy, security, democracy, and freedom,” and called on Apple to shelve it permanently.
“Apple’s plan to conduct on-device scanning of photos and messages is one of the most dangerous proposals from any tech company in modern history,” Fight for the Future Director Evan Greer said in a statement. “Technologically, this is the equivalent of installing malware on millions of people’s devices — malware that can be easily abused to do enormous harm.”
Correction: A previous version of this story misstated the name of the digital rights advocacy group Fight for the Future.