New iPhone child safety feature is ‘dangerous’: Experts give 7 reasons to Apple
Apple last week announced a new child safety feature that it intends to roll out with the upcoming iOS 15 update. The feature will allow Apple to identify child pornographic images stored on iPhones and on other devices linked to a user’s iCloud account, such as iPads, Apple Watches and Macs. The new child safety feature will initially roll out in the US. Apple has said it may extend it to other countries in the future, depending on local laws and on whether governments want such a tracking system in place.
The system, if implemented, will allow Apple to manually review flagged accounts and block a user’s Apple ID if child porn is detected. In the US, Apple may also alert the police and the National Center for Missing and Exploited Children (NCMEC) about potential abuse. And if such content is detected on a child’s iPhone, their parents will be notified.
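To illustrate the kind of mechanism the critics below are objecting to, here is a deliberately simplified sketch of on-device hash matching in Python. Apple’s actual design uses a perceptual hash called NeuralHash combined with private set intersection and threshold secret sharing, so the plaintext blocklist, the SHA-256 digests, the file paths and the function names below are purely illustrative assumptions, not Apple’s implementation:

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for the NCMEC hash database; Apple's real system
# distributes blinded NeuralHash values, not plaintext SHA-256 digests.
KNOWN_CSAM_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

# Apple said roughly 30 matches are needed before human review begins.
REVIEW_THRESHOLD = 30

def scan_photo_library(photo_dir: str) -> bool:
    """Return True if enough photos match the blocklist to trigger review."""
    matches = 0
    for photo in Path(photo_dir).glob("*.jpg"):
        # Hash each photo on the device and check it against the blocklist.
        digest = hashlib.sha256(photo.read_bytes()).hexdigest()
        if digest in KNOWN_CSAM_HASHES:
            matches += 1
    return matches >= REVIEW_THRESHOLD
```

Even this toy version makes the structural worry visible: the code cannot tell what the hash database contains, so whoever controls the database controls what gets flagged.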
While Apple’s intentions are good, there is serious concern about what happens if people are falsely accused, or if local governments force Apple to use the same technology to snoop on political opponents, protesters and whistleblowers.
Security and privacy experts, cryptographers, researchers, professors, legal experts and other individuals have signed an open letter urging Apple not to implement the feature, arguing that it would undermine user privacy and end-to-end encryption. The biggest concern, the letter said, is that because “both checks will be performed on the user’s device, they have the potential to bypass any end-to-end encryption that would otherwise safeguard the user’s privacy.”
Here are some of the reasons experts don’t want Apple to implement the new child safety feature:
Apple is opening the door to broader abuses: Electronic Frontier Foundation
“It’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses […] That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change,” it said.
Apple’s changes in fact create new risks to children and all users: The Center for Democracy and Technology
“Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope-creep not only in the U.S., but around the world,” says Greg Nojeim, Co-Director of CDT’s Security & Surveillance Project. “Apple should abandon these changes and restore its users’ faith in the security and integrity of their data on Apple devices and services.”
What if child porn tracking is extended to state-specific censorship: Open Privacy Research Society
“If Apple is successful in introducing this, how long do you think it will be before the same is expected of other providers? Before walled gardens prohibit apps that don’t do it? Before it is enshrined in law? How long do you think it will be before the database is expanded to include ‘terrorist’ content? ‘Harmful-but-legal’ content? State-specific censorship?” warned Sarah Jamie Lewis, Executive Director of the Open Privacy Research Society.
‘Apple is already bending to local laws’
“Apple sells iPhones without FaceTime in Saudi Arabia, because local regulation prohibits encrypted phone calls. That’s just one example of many where Apple’s bent to local pressure. What happens when local regulations in Saudi Arabia mandate that messages be scanned not for child sexual abuse, but for homosexuality or for offenses against the monarchy?” said Dr. Nadim Kobeissi, a researcher in security and privacy issues.
‘Apple’s new child safety feature will lead to global abuse’
The Electronic Frontier Foundation in a statement said, “Take the example of India, where recently passed rules include dangerous requirements for platforms to identify the origins of messages and pre-screen content. New laws in Ethiopia requiring content takedowns of “misinformation” in 24 hours may apply to messaging services. And many other countries—often those with authoritarian governments—have passed similar laws. Apple’s changes would enable such screening, takedown, and reporting in its end-to-end messaging. The abuse cases are easy to imagine: governments that outlaw homosexuality might require the classifier to be trained to restrict apparent LGBTQ+ content, or an authoritarian regime might demand the classifier be able to spot popular satirical images or protest flyers.”
Apple is rolling out mass surveillance to the entire world: Snowden
“No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow. They turned a trillion dollars of devices into iNarcs—without asking,” tweeted Edward Snowden.
What will happen when spyware companies find a way to exploit this software: WhatsApp head Will Cathcart
WhatsApp head Will Cathcart said, “Apple has long needed to do more to fight CSAM, but the approach they are taking introduces something very concerning into the world. Instead of focusing on making it easy for people to report content that’s shared with them, Apple has built software that can scan all the private photos on your phone — even photos you haven’t shared with anyone. That’s not privacy.
“Can this scanning software running on your phone be error proof? Researchers have not been allowed to find out. Why not? How will we know how often mistakes are violating people’s privacy? What will happen when spyware companies find a way to exploit this software? Recent reporting showed the cost of vulnerabilities in iOS software as is. What happens if someone figures out how to exploit this new system?”