Apple announced a new tool in August for combating child exploitation. The tool drew heavy criticism once its privacy implications came to light, and Apple is now stepping back from implementing it.
Apple made headlines – and not the good kind – last month when it announced a test of a new tool aimed at combating child exploitation. Critics quickly decried the feature’s potential privacy implications, and now Apple is taking a long pit stop before moving forward with its plans.
Apple will continue testing the tool and plans a relaunch. Included will be an opt-in feature that warns parents of minor children about sexually explicit incoming or sent image attachments in iMessage and blurs them. Privacy concerns ran high, however: digital rights group Fight for the Future called the tool a threat to “privacy, security, democracy, and freedom” and urged Apple to abandon it permanently.
“Apple’s plan to conduct on-device scanning of photos and messages is one of the most dangerous proposals from any tech company in modern history,” director Evan Greer said in a statement. “Technologically, this is the equivalent of installing malware on millions of people’s devices – malware that can be easily abused to do enormous harm.”
Apple stressed that consumers’ privacy would be protected because the tool would turn photos on iPhones and iPads into unreadable hashes, or complex strings of numbers, stored on user devices. Those numbers would be matched against a database of hashes provided by the National Center for Missing and Exploited Children (NCMEC) once the pictures were uploaded to Apple’s iCloud storage service, CNBC reported.
Apple said that the system would only flag cases where users had approximately 30 or more potentially illicit pictures.
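To make that matching-plus-threshold idea concrete, here is a minimal sketch in Swift. It is a hypothetical illustration only, assuming a simple string-based hash and an in-memory hash set; Apple’s actual system reportedly relies on a perceptual hashing scheme (NeuralHash) and cryptographic matching safeguards, and the names `knownHashes` and `shouldFlagAccount` and the sample values below are invented for the example.

```swift
import Foundation

// Conceptual sketch of hash matching with a flag threshold.
// NOTE: the types, names, and values here are hypothetical stand-ins;
// Apple's real system does not expose an API like this.

typealias ImageHash = String  // stand-in for an opaque hash value

// Hypothetical database of known hashes (in practice, supplied by NCMEC).
let knownHashes: Set<ImageHash> = ["hash-a", "hash-b", "hash-c"]

// Per Apple, an account would only be flagged past roughly 30 matches.
let matchThreshold = 30

/// Counts how many uploaded photo hashes appear in the known-hash database
/// and flags the account only when the count reaches the threshold.
func shouldFlagAccount(uploadedHashes: [ImageHash]) -> Bool {
    let matchCount = uploadedHashes.filter { knownHashes.contains($0) }.count
    return matchCount >= matchThreshold
}

// Two matches is far below the threshold, so nothing is flagged.
print(shouldFlagAccount(uploadedHashes: ["hash-a", "hash-b", "hash-z"]))  // false
```

The key design point is that the comparison happens against opaque hashes rather than viewable photos, and no single match triggers a report.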
The system also works with the Siri digital assistant, offering resources for reporting child abuse and fighting CSAM. Alongside the scanning of devices operated by children for incoming or outgoing explicit images, a new feature for iCloud Photos would analyze a user’s library for explicit images of children. If such pictures were discovered in a library, Apple would be alerted, conduct a human review to verify the contents, and then report the user to law enforcement.
The Electronic Frontier Foundation, a privacy advocacy group, warned that the technology could be used to track things other than child pornography, opening the door to “broader abuses.”