At the beginning of the month, Apple announced a new child-safety technology in the software running on all its devices. The technology, dubbed NeuralHash, will scan photos on-device before they are uploaded to iCloud, checking for images that match known child sexual abuse material.
However, many tech experts, including the head of WhatsApp, are pushing back against these new measures, citing privacy concerns and the potential for the technology to be manipulated in ways that harm users.
In its “Expanded Protections for Children” statement on its website, Apple says it has collaborated with child safety experts to introduce a total of three features, including communication tools for parents. These aim to give guardians of children under 13 an informed role in monitoring sexually explicit images searched for, sent, received, or recorded on their children’s devices.
“Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM (Child Sexual Abuse Material) image hashes,” reads part of the statement. This matching process determines whether a photo on the device corresponds to anything in that database.
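Apple has not published NeuralHash’s internals, but the general idea of on-device hash matching can be illustrated with a toy sketch: compute a perceptual hash for a local photo and compare it against a database of known hashes before upload, treating two hashes as a match if they differ in only a few bits. All names and values below are hypothetical, for illustration only.

```python
# Toy illustration of on-device hash matching. This is NOT Apple's
# actual NeuralHash algorithm; the function names, hash sizes, and
# threshold are invented for the example.

def hamming_distance(a: int, b: int) -> int:
    """Count the number of differing bits between two hash values."""
    return bin(a ^ b).count("1")

def matches_database(photo_hash: int, known_hashes: set, threshold: int = 4) -> bool:
    """Flag a photo if its hash lies within `threshold` bits of any known hash."""
    return any(hamming_distance(photo_hash, h) <= threshold for h in known_hashes)

# Hypothetical database of two known image hashes, and one candidate photo.
known = {0b101101001110, 0b010011100001}
candidate = 0b101101001111  # differs from the first entry by a single bit

print(matches_database(candidate, known))  # True: within the match threshold
```

The key property of a perceptual hash, unlike a cryptographic one, is that visually similar images produce numerically close hashes, which is why matching uses a small bit-distance threshold rather than exact equality.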
As soon as Apple announced the upcoming updates, the tech world was up in arms, with figures like Edward Snowden questioning, in particular, the move to scan images on-device. The concern centers on how far the technology could be extended, since the company has already said it will work with government authorities to report cases it finds depicting sexually explicit activity involving a child.
In an open letter gathering signatures on GitHub, thousands of users express fear that Apple’s proposal introduces a backdoor that could undermine the essential privacy protections promised in Apple products.
In a full blog post, the Electronic Frontier Foundation warned that broader abuses of such a system are easy to foresee.
“All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts,” the foundation wrote.
Apple has come out today to defend the controversial and seemingly invasive implementation, saying it is only concerned with detecting CSAM stored in iCloud. The list of banned images is provided by the National Center for Missing and Exploited Children (NCMEC) and other child safety organizations.
“This feature only impacts users who have chosen to use iCloud Photos to store their photos. It does not impact users who have not chosen to use iCloud Photos. There is no impact to any other on-device data,” Apple wrote.
On the question of authoritarian governments demanding use of the technology for surveillance, the company said in an FAQ document that it will not bow to such pressure.
“We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future,” Apple stated.