Apple delays plans to scan devices for child abuse images after backlash over privacy


Apple is temporarily hitting the pause button on its controversial plans to scan users’ devices for child sexual abuse material (CSAM) after sustained backlash over concerns that the tool could be weaponized for mass surveillance and erode users’ privacy.

“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the iPhone maker said in a statement on its website.

However, the announcement does not make clear what kind of input it will gather, the nature of the changes it intends to make, or how it plans to implement the system in a way that mitigates the privacy and security concerns that could arise once it is deployed.

The changes were originally scheduled to roll out with iOS 15 and macOS Monterey later this year, starting in the United States.

In August, Apple detailed several new features intended to help limit the spread of CSAM on its platform: scanning users’ iCloud Photos libraries for illicit content, a Communication Safety option in the Messages app to warn children and their parents when receiving or sending sexually explicit photos, and expanded guidance in Siri and Search when users attempt to look up CSAM-related topics.

The so-called NeuralHash technology would have worked by matching photos on users’ iPhones, iPads, and Macs against a database of known child sexual abuse imagery maintained by the National Center for Missing and Exploited Children (NCMEC) just before they were uploaded to iCloud Photos, without Apple having to possess the images or learn their contents. iCloud accounts that crossed a set threshold of 30 matching hashes would then be manually reviewed, have their profiles disabled, and be reported to law enforcement.
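The sketch below illustrates the general idea of threshold-based hash matching only; it is not Apple’s actual protocol. The SHA-256 stand-in, the hash set, and the helper names are invented for illustration, and the real design relied on a perceptual hash (NeuralHash) plus cryptographic “safety vouchers” so that matches below the threshold stay hidden even from Apple.

```python
import hashlib

THRESHOLD = 30  # Apple's announced threshold before any human review

def image_hash(photo_bytes: bytes) -> str:
    # Stand-in for a perceptual hash. A real perceptual hash is designed to
    # survive resizing and re-encoding, which SHA-256 is not.
    return hashlib.sha256(photo_bytes).hexdigest()

def count_matches(photo_library: list[bytes], known_hashes: set[str]) -> int:
    """Count photos in the upload queue whose hash appears in the known-hash set."""
    return sum(1 for photo in photo_library if image_hash(photo) in known_hashes)

def should_flag_for_review(photo_library: list[bytes], known_hashes: set[str]) -> bool:
    # Only once the match count reaches the threshold would an account be
    # surfaced for human review; Apple's design enforced this cryptographically
    # (threshold secret sharing over "safety vouchers"), not with a plain counter.
    return count_matches(photo_library, known_hashes) >= THRESHOLD
```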

The measures aimed to strike a compromise between protecting customer privacy and meeting the growing demands of government agencies in investigations related to terrorism and child pornography, and, by extension, to offer an answer to the so-called “going dark” problem of criminals exploiting encryption protections to hide their contraband activities.

However, the proposals were met with near-instantaneous backlash, with the Electronic Frontier Foundation (EFF) calling out the technology giant for attempting to build an on-device surveillance system, adding that “a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.”

“Once this capability is incorporated into Apple products, the company and its competitors will face enormous pressure (and potentially legal requirements) from governments around the world to scan photos not only for CSAM, but also for other images that a government deems unacceptable,” the Center for Democracy & Technology (CDT) said in an open letter.

“These images can be of human rights abuses, political protests, images that companies have labeled as ‘terrorist’ or violent extremist content, or even unflattering images of the politicians themselves who will pressure the company to look for them. And that pressure could extend to all images stored on the device, not just those uploaded to iCloud. So Apple will have laid the groundwork for censorship, surveillance and persecution worldwide,” the letter read.

But in an email circulated internally at Apple, child safety campaigners were found dismissing the complaints of privacy activists and security researchers as the “screeching voices of the minority.”

Since then, Apple has stepped in to allay potential concerns arising from unintended consequences, pushing back against the possibility that the system could be used to detect other forms of photos at the request of authoritarian governments. “Let’s be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it,” the company said.

However, that did little to allay fears that client-side scanning could open the door to troubling invasions of privacy, that it could be extended to further abuses, and that it could provide a blueprint for breaking end-to-end encryption. Nor did it help that researchers were able to create “hash collisions” (aka false positives) by reverse-engineering the algorithm, leading to scenarios in which two completely different images generated the same hash value, effectively fooling the system into thinking the images were identical when they are not.
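As a rough illustration of why collisions worry researchers, the toy example below uses a simple “average hash” (not NeuralHash; the images and hash scheme here are invented for illustration) to show two visibly different images producing an identical hash, i.e. a false positive.

```python
def average_hash(pixels):
    """Hash an 8x8 grayscale image: one bit per pixel, set if the pixel is above the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Image A: a hard black/white checkerboard.
image_a = [[255 if (r + c) % 2 == 0 else 0 for c in range(8)] for r in range(8)]

# Image B: a muted gray checkerboard -- visibly different pixel values from A,
# but every pixel falls on the same side of its own image mean.
image_b = [[160 if (r + c) % 2 == 0 else 96 for c in range(8)] for r in range(8)]

assert image_a != image_b                               # the raw images differ
assert average_hash(image_a) == average_hash(image_b)   # ...yet their hashes collide
print(f"colliding hash: {average_hash(image_a):016x}")
```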

“My suggestions to Apple: (1) Talk to the technical and policy communities before you do whatever you’re going to do. Talk to the general public as well. This isn’t a fancy new Touch Bar: it’s a privacy compromise that affects 1 billion users,” Johns Hopkins professor and security researcher Matthew D. Green tweeted.

“Be clear about why you’re scanning and what you’re scanning. Going from scanning nothing (but email attachments) to scanning everyone’s private photo library was an enormous delta. Escalations like this need to be justified,” Green added.
