Apple has defended its new system, which scans phones for child sexual abuse material (CSAM), after a backlash from customers.
Critics warned it could be a “backdoor” to spy on people, and more than 5,000 people and organizations have signed an open letter against the technology.
Apple has therefore pledged not to “expand the system” for any reason.
Last week, digital privacy campaigners warned that authoritarian governments could use the technology to bolster anti-LGBT regimes or crack down on political dissidents in countries where protests are deemed illegal.
Apple said it would refuse any government request to expand the system.
It published a question-and-answer document, saying it had numerous safeguards in place to stop its systems from being used for anything other than detecting child abuse imagery.
“We have been faced with demands to make and implement government-mandated changes that compromise the privacy of users in the past, and we have consistently refused these demands,” the company said, adding that it would continue to reject them in the future.
Apple has made concessions in the past to ensure that it can continue operating in other countries.
Last New Year’s Eve, the tech giant pulled 39,000 games from its Chinese App Store as part of a crackdown by the authorities on unlicensed games.
Apple also stated that its anti-CSAM tool will not allow it to scan a user’s entire photo album; it scans only photos shared via iCloud.
The system searches for matches on the device itself, against a list of hashes of known CSAM images supplied by child safety organizations.
Apple claims it is almost impossible for the system to falsely flag innocent people to the police, stating that “less than one in one trillion accounts would be incorrectly flagged each year”. A human then reviews any positive matches.
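The article describes the mechanism only at a high level; Apple’s announced system actually uses a perceptual hash (NeuralHash) compared on-device via cryptographic private set intersection. As a rough illustration of the general idea alone (matching image hashes against a known list, with a threshold before human review), here is a minimal, hypothetical Python sketch. The function names, the SHA-256 stand-in, and the threshold value are assumptions for illustration, not Apple’s implementation.

```python
import hashlib
from pathlib import Path

# Illustrative list of known-image hashes. In Apple's system these would be
# NeuralHash values supplied by child safety organizations; a placeholder
# SHA-256 digest stands in here.
KNOWN_CSAM_HASHES: set[str] = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

# Number of matches required before human review. The value is illustrative;
# Apple described applying a match threshold before any account is flagged.
REVIEW_THRESHOLD = 30


def image_hash(path: Path) -> str:
    """Hash an image file. A real deployment would use a perceptual hash that
    survives resizing and re-encoding; plain SHA-256 (used here) does not."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def count_matches(photos: list[Path]) -> int:
    """Count photos whose hash appears in the known-hash list."""
    return sum(1 for p in photos if image_hash(p) in KNOWN_CSAM_HASHES)


def should_escalate(photos: list[Path]) -> bool:
    """Flag the account for human review only once the threshold is passed."""
    return count_matches(photos) >= REVIEW_THRESHOLD
```

Thresholding of this kind is part of what underpins the low false-positive claim: a single accidental hash collision would not, by itself, flag an account.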
Privacy advocates argue that the only thing stopping the technology from being used for other purposes is Apple’s promise that it will not be.
For example, the digital rights group the Electronic Frontier Foundation said that “all it would take… is an expansion of the machine learning parameters to look for additional types of content”.
It warned that this was “not a slippery slope” but a fully built system, waiting only for external pressure to make the slightest change.
Apple also offered assurances about a new feature that will alert parents and children using linked family accounts when explicit photos are sent or received.
Apple says these features use different technology from the CSAM tool, and that it will not gain access to private communications.
Privacy advocates were outraged by Apple’s announcement, but some politicians were open to the idea.
Sajid Javid, the UK Health Secretary, said it was time for other platforms, including Facebook, to follow Apple’s lead.
Source: bbc.com