Current owners of Apple iPhones, iPads, and Macs face multiple threats, but Apple’s new CSAM detection system has generated more controversy than the rest combined. And it has just taken another turn.
In a powerful new post, Edward Snowden has delved into Apple’s CSAM (child sexual abuse material) detection system, which will arrive on Apple’s 1.65BN active iPhones, iPads and Macs next month. He states: “Apple’s new system, regardless of how anyone tries to justify it, will permanently redefine what belongs to you, and what belongs to them.” He also explains what you can do about it, for now.
CSAM detection works by matching a user’s images against a database of known illegal material. “Under the new design, your phone will now perform these searches on Apple’s behalf before your photos have even reached their iCloud servers, and … if enough ‘forbidden content’ is discovered, law enforcement will be notified,” Snowden explains. “Apple plans to erase the boundary dividing which devices work for you, and which devices work for them.”
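To make the mechanism Snowden describes concrete, here is a minimal sketch of on-device, threshold-based hash matching. It is emphatically not Apple’s implementation: the real system uses a perceptual “NeuralHash”, a blinded match database and cryptographic threshold secret sharing, none of which appear here. The hash function, database contents and threshold value below are hypothetical placeholders.

```python
# Minimal sketch of on-device, threshold-based hash matching.
# NOT Apple's implementation: the real system uses a perceptual
# "NeuralHash" plus private set intersection and threshold secret
# sharing. Hash function, database, and threshold are placeholders.
import hashlib

# Stand-in for the database of hashes of known illegal images
# that would ship with the OS (contents here are hypothetical).
KNOWN_BAD_HASHES: set[str] = {"hash-of-known-image-1", "hash-of-known-image-2"}

# Hypothetical threshold of matches before an account is flagged.
MATCH_THRESHOLD = 30

def image_fingerprint(image_bytes: bytes) -> str:
    # Placeholder: a real perceptual hash survives resizing and
    # re-encoding; SHA-256 does not, and is used here only to keep
    # the sketch runnable.
    return hashlib.sha256(image_bytes).hexdigest()

def scan_before_upload(photo_library: list[bytes]) -> bool:
    """Return True if enough matches accumulate to trigger review,
    mirroring the 'scan on device, then report' flow Snowden describes."""
    matches = sum(
        1 for photo in photo_library
        if image_fingerprint(photo) in KNOWN_BAD_HASHES
    )
    return matches >= MATCH_THRESHOLD
```

The design point at the heart of the controversy is visible even in this toy version: the scan runs on the user’s own device, before anything reaches iCloud.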
“The day after this system goes live, it will no longer matter whether or not Apple ever enables end-to-end encryption, because our iPhones will be reporting their contents before our keys are even used [his emphasis],” Snowden says. And while citing “compelling evidence” from researchers that Apple’s CSAM detection system is seriously flawed, he draws attention to a much bigger point:
“Apple gets to decide whether or not its phones will monitor their owners’ infractions for the government, but it’s the government that gets to decide what constitutes an infraction … and how to handle it.”
Apple’s CSAM detection system will arrive in iOS 15 and macOS Monterey next month
In addition, Snowden points out that the whole system can be easily bypassed, which undermines its stated reason for existing:
“If you’re an enterprising pedophile with a basement full of CSAM-tainted iPhones, Apple welcomes you to entirely exempt yourself from these scans by simply flipping the ‘Turn off iCloud Photos’ switch, a bypass which reveals that this system was never designed to protect children, as they would have you believe, but rather to protect their brand. As long as you keep that material off their servers, and so keep Apple out of the headlines, Apple doesn’t care.”
And, for those of you already thinking ahead, Snowden points out the obvious next step: governments forcing Apple to remove the option to turn off iCloud photo uploads.
“If Apple demonstrates the capability and willingness to continuously, remotely search every phone for evidence of one particular type of crime, these are questions for which they will have no answer. And yet an answer will come, and it will come from the worst lawmakers of the worst governments. This is not a slippery slope. It’s a cliff.”
Researchers have already pointed out the many ways Apple’s system could be exploited, and the company risks being shut out of key markets if it refuses to comply with government demands. There is precedent here. In May, Apple was accused of compromising on censorship and surveillance in China after agreeing to move its Chinese customers’ personal data to the servers of a Chinese state-owned company. Apple also admits it supplied customer data to the U.S. government nearly 4,000 times last year.
Apple’s privacy statement from the official privacy homepage
Snowden warns: “I can’t think of any other company that has so proudly, and so publicly, distributed spyware to its own devices … There is no fundamental technological limit to how far the precedent Apple is establishing can be pushed, meaning the only restraint is Apple’s all-too-flexible company policy, something governments understand all too well.”
Interestingly, Snowden doesn’t touch on another key threat: Apple being hacked. Building a backdoor into such a comprehensive detection system means Apple itself may not be aware of how its devices are being scanned and manipulated.
“[Apple is] inventing a world in which every product you purchase owes its highest loyalty to someone other than its owner. In short, this is not an innovation but a tragedy, a disaster-in-the-making.”
So far, Apple has defended its CSAM detection system by arguing that the backlash stems from poor communication. But last week researchers who had worked on a similar system for two years concluded that the technology was “dangerous,” saying “it baffled us to see that Apple had few answers to the tough questions we had come up with.”
CSAM detection will launch with iOS 15, iPadOS 15, watchOS 8 and macOS Monterey next month. I have contacted Apple for comment and will update this post when/if I receive a response.
In the meantime, I would advise all Apple fans to read Snowden’s post in full and make up their own minds.
___
Follow Gordon on Facebook
More on Forbes
New iPhone iMessage Flaw Enables ‘Zero Click’ Hack
Researchers Label Apple’s CSAM Detection System ‘Dangerous’