Apple says its announcement of automated tools to detect child sexual abuse material on the iPhone and iPad was "jumbled pretty badly."
The company announced new image-detection software on 5 August that would alert Apple if known illegal images were uploaded to its iCloud storage.
Privacy groups criticized the move, with some saying Apple had created a security loophole in its software.
According to the company, its announcement was widely misunderstood.
"We wish that this had come out a little more clearly for everyone," software chief Craig Federighi said in an interview with the Wall Street Journal.
He said that introducing two features at the same time was a recipe for confusion.
What are the latest tools?
Apple announced two new tools designed to protect children. They will be introduced in the US first.
Image detection
The first tool can identify known child sexual abuse material (CSAM) when a user uploads photos to iCloud storage.
The US National Center for Missing and Exploited Children maintains a database of known child abuse images. They are stored as hashes, digital "fingerprints" of the illegal material.
Cloud service providers such as Microsoft, Google, and Facebook already check images against these hashes to make sure people are not sharing CSAM.
Apple implemented a similar process, but said the matching would happen on a person's iPhone or iPad before photos are uploaded to iCloud.
Federighi stated that the iPhone wouldn’t be looking for pornography or photos of children in the bathtub.
He said the system could only match the "exact fingerprints" of specific known images of child sexual abuse.
Apple will flag accounts whose uploaded images match those known child abuse fingerprints.
Federighi said a user would have to upload around 30 matching images before the feature would be triggered.
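To make the fingerprint-matching and threshold idea above more concrete, here is a rough, hypothetical sketch in Python. It is not Apple's implementation, which relies on its own perceptual hashing and cryptographic protocols; a plain SHA-256 digest simply stands in for the "fingerprint," and the 30-match threshold mirrors the figure Federighi cited.

```python
import hashlib
from pathlib import Path

# Hypothetical sketch only: Apple's real system uses its own perceptual
# hashing and cryptographic protocols. A plain SHA-256 digest stands in
# for the "fingerprint" described above.

MATCH_THRESHOLD = 30  # roughly the figure Federighi cited


def fingerprint(path: Path) -> str:
    """Return a hex digest acting as a stand-in 'fingerprint' for a photo."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def count_matches(upload_paths, known_fingerprints):
    """Count uploaded photos whose fingerprints appear in the known-CSAM set."""
    return sum(1 for p in upload_paths if fingerprint(Path(p)) in known_fingerprints)


def should_flag_account(upload_paths, known_fingerprints):
    """Flag the account only once the number of matches crosses the threshold."""
    return count_matches(upload_paths, known_fingerprints) >= MATCH_THRESHOLD
```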
Message filtering
In addition to its iCloud tool, Apple announced a parental control feature that parents can choose to activate on their children's accounts.
If activated, the system will check photos sent to or by the child via Apple's iMessage app.
If the machine-learning system judges that a photo contains nudity, it will obscure the image and warn the child.
Parents can choose to be notified if their child views the photo.
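As a purely illustrative sketch of the opt-in flow described above (the names and the classifier here are placeholders, not Apple's API), the decision logic might look something like this:

```python
from dataclasses import dataclass

# Illustrative sketch only: the real feature runs an on-device
# machine-learning model; `looks_like_nudity` is a placeholder.

@dataclass
class FilterOutcome:
    image_obscured: bool
    child_warned: bool
    parent_notified: bool


def looks_like_nudity(image_bytes: bytes) -> bool:
    """Placeholder for the on-device classifier; always returns False here."""
    return False


def handle_photo(image_bytes: bytes, parent_opted_in: bool,
                 child_viewed_anyway: bool) -> FilterOutcome:
    """Obscure and warn on suspected nudity; notify parents only if opted in."""
    flagged = looks_like_nudity(image_bytes)
    return FilterOutcome(
        image_obscured=flagged,
        child_warned=flagged,
        parent_notified=flagged and child_viewed_anyway and parent_opted_in,
    )
```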
Criticism
Privacy groups expressed concern that this technology could be used to spy on citizens by authoritarian governments.
WhatsApp chief Will Cathcart called Apple's move "very troubling," while US whistleblower Edward Snowden called the iPhone a "spyPhone."
Federighi said the "soundbite" that spread immediately after the announcement was that Apple was scanning iPhones for images.
He told the Wall Street Journal that “that is not what’s happening.”
“We feel strongly and positively about what we do and can see that many people have misunderstood it.”
These tools will be available in the next versions of iOS and iPadOS, expected to be released later this year.
Reference: BBC