Apple bowed to privacy and civil liberties advocates Friday when it agreed to delay and modify a controversial plan to scan users’ photos for child pornography.
The company’s tool, called “neuralMatch,” would scan images on Apple users’ devices before they’re uploaded to iCloud. A separate tool would scan images in users’ encrypted messages for sexually explicit content.
After Apple announced the effort in August, privacy advocates hit back at the company.
The Electronic Frontier Foundation, a digital privacy group, racked up more than 25,000 signatures on a petition against the tool, while the American Civil Liberties Union said in a letter that the tool would “censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.”
Critics say the tool could easily be misused by repressive governments to track and punish users for all kinds of content besides child pornography, including political content.
And some privacy activists have pointed to Apple’s seemingly accommodating relationship with the government of China, where the vast majority of its devices are manufactured, as evidence that the company would allow the tool to be used for political repression.
In a call with reporters prior to Friday’s announcement, an Apple representative was asked whether the company would exit the Chinese market if authorities demanded that the company use the scanning tool for other purposes. The Apple representative replied that such a decision would be “above their pay grade,” Vice reported.
In Friday’s announcement, Apple did not provide specifics on how it would change its child protection features, but acknowledged the backlash.
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” Apple said in a statement to multiple media outlets. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
The features were originally scheduled to roll out this year. It’s now unclear when the company plans to release them or how they’ll be changed.
Apple has said that the tool will flag only images that are already in a database of known child pornography, meaning photos parents take of their children bathing would not be flagged, for example.
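The distinction Apple draws is between recognizing image content and matching image fingerprints against a fixed database of known material. A minimal sketch of that matching idea, using an ordinary cryptographic hash in place of Apple’s proprietary perceptual hash and an invented database, might look like:

```python
import hashlib

# Hypothetical database of fingerprints of known flagged images.
# Apple described a perceptual hash; a plain SHA-256 is used here
# only to illustrate the set-membership check.
KNOWN_HASHES = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}

def is_flagged(image_bytes: bytes) -> bool:
    """Return True only if the image's fingerprint is already in the database."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

# A new, never-before-seen photo does not match, regardless of its content:
print(is_flagged(b"a parent's bath-time photo"))  # False
print(is_flagged(b"known-flagged-image-bytes"))   # True
```

A cryptographic hash matches only exact bytes; Apple’s perceptual hash is designed to tolerate minor edits such as resizing or recompression. But the principle critics are debating is the same: the system can only flag items already placed in the database, which is also why control over that database matters.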
Another of Apple’s features would scan images sent to minors through iMessage for sexually explicit content, blurring such images and sending a warning to the child.
Johns Hopkins University cybersecurity researcher Matthew Green, a critic of Apple’s features, said Friday’s move “looks promising.”
“Talk to the technical and policy communities before you do whatever you’re going to do,” Green wrote in a Twitter thread addressing Apple. “Talk to the general public as well. This isn’t a fancy new Touch Bar: it’s a privacy compromise that affects 1 [billion] users.”
Electronic Frontier Foundation Executive Director Cindy Cohn said in a statement to The Post that Friday’s delay, while welcome, doesn’t go far enough.
“The company must go further than just listening and drop its plans to put a backdoor into its encryption entirely,” Cohn said. “These features would create an enormous danger to iPhone users’ privacy and security, offering authoritarian governments a turnkey mass surveillance system to spy on citizens.”
“The enormous coalition that has spoken out will continue to demand that user phones — both their messages and their photos — be protected, and that the company maintain its promise to provide real privacy to its users,” Cohn added.