Apple backs off plans for new child safety tools after privacy backlash

On Friday, the company said it will pause the rollout of the tools to gather more feedback and make improvements.

The plan centers on a new system that, if eventually launched, would check images on iOS devices and in iCloud Photos for child abuse imagery. It also includes a separate opt-in feature that would warn minors and their parents about sexually explicit image attachments sent or received in iMessage and blur them.

Apple’s announcement last month that it would begin testing the tool fit with a recent increased focus on child protection among tech companies, but it was light on specific details and was swiftly met with outraged tweets, critical headlines and calls for more information.
So on Friday, Apple (AAPL) said it would hold off on rolling out the features.

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of child sexual abuse material,” the company said. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

How Apple’s plan to combat child abuse imagery unraveled
In a series of press calls aimed at explaining the planned tool last month, Apple stressed that consumer privacy would be protected because the tool would turn photos on iPhones and iPads into unreadable hashes, or complex numbers, stored on users’ devices. Those numbers would be compared against a database of hashes provided by the National Center for Missing and Exploited Children (NCMEC) once the images were uploaded to Apple’s iCloud storage service. (Apple later said other organizations would be involved in addition to NCMEC.)

Only after a certain number of hashes matched the NCMEC database would Apple’s review team be alerted so that it could decrypt the information, disable the user’s account and alert NCMEC, which could then inform law enforcement about the existence of potentially abusive images.
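To illustrate the general shape of the threshold-based hash matching Apple described, here is a minimal Swift sketch. It is not Apple’s implementation: the real system relies on a perceptual “NeuralHash,” private set intersection and threshold secret sharing, none of which is reproduced here. The names and values used below (knownImageHashes, alertThreshold, shouldTriggerReview) are invented for illustration, and an ordinary cryptographic hash stands in for the perceptual hash.

```swift
import Foundation
import CryptoKit

// Placeholder for a database of known-image hashes (e.g. as supplied by NCMEC).
// Hard-coded and empty in this sketch.
let knownImageHashes: Set<String> = []

// Illustrative threshold; not a value Apple published.
let alertThreshold = 30

// A cryptographic hash stands in for Apple's perceptual "NeuralHash" here.
func imageHash(_ data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// Count how many uploaded images match entries in the known database.
func matchCount(of uploads: [Data]) -> Int {
    uploads.filter { knownImageHashes.contains(imageHash($0)) }.count
}

// Per Apple's description, human review would only be triggered once the
// number of matches crosses a threshold.
func shouldTriggerReview(of uploads: [Data]) -> Bool {
    matchCount(of: uploads) >= alertThreshold
}
```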

Many child safety and security experts praised the plan’s intent, acknowledging the ethical responsibilities and obligations a company has over the products and services it creates. But they also said the effort raised potential privacy concerns.

“The perception that Apple is ‘searching’ for child sexual abuse material (CSAM) on end users’ phones immediately conjures up thoughts of Big Brother and ‘1984,’” Ryan O’Leary, research manager of privacy and legal technology at market research firm IDC, told CNN Business last month. “This is a very nuanced issue and one that on its face can seem quite scary or intrusive.”

Critics of the plan applauded Apple’s decision to hit pause.

The digital rights group Fight for the Future described the tool as a threat to “privacy, security, democracy and freedom” and called on Apple to shelve it permanently.

“Apple’s plan to conduct on-device scanning of photos and messages is one of the most dangerous proposals from any tech company in modern history,” Fight for the Future director Evan Greer said in a statement. “Technologically, this is the equivalent of installing malware on millions of people’s devices: malware that can be easily abused to do enormous harm.”
