Why you should delete Google Photos from your iPhone, iPad, and Mac

When it comes to storing photos in the cloud, Google Photos is the clear leader of the pack: four trillion photos and videos across more than a billion users. Millions of Apple users keep Google Photos on their iPhones, iPads, and Macs, but Apple has just issued a serious warning about Google's platform, giving those users a reason to delete its apps.

These have been a terrible few weeks for Apple on the privacy front, not what the iPhone maker needs in the run-up to the launch of the iPhone 13 and iOS 15. A week ago, the company uncomfortably (though inevitably) backtracked on its ill-conceived plan to screen its users' photos on their own devices to detect known child abuse imagery.

CSAM screening itself is not controversial. All the major cloud platforms, including Google Photos, have been doing it for years. "Child sexual abuse material has no place on our platforms," Google told me. "We use a wide range of industry-standard scanning techniques, including hash-matching technology and artificial intelligence, to identify and remove CSAM that has been uploaded to our servers."

But Apple, it seems, has not been doing the same. The company has not yet applied any such screening to iCloud Photos, and its reasoning for that seemingly surprising decision once again highlights the different privacy philosophies at stake.

Apple's controversial (and now paused) plan, the company said, was designed to flag known images "without learning any information about non-CSAM images," which is why it chose to detect CSAM on the device rather than in the cloud. What this means is that all users should not have to cede the privacy of all their content just to flag a tiny minority.

The principle itself is sound enough. If your private iPhone doesn't flag any possible CSAM matches, Apple's servers can ignore all of your content. If your iPhone does flag possible matches, at least 30 of them, then the server knows exactly where to look.
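
To make that threshold idea concrete, here is a minimal, purely illustrative sketch in Swift. The string "hashes," the sample database, and the helper names are all invented for illustration; Apple's real design relies on perceptual hashing and cryptographic safety vouchers, not anything this simple.

```swift
// Toy sketch of threshold-based matching, NOT Apple's actual system.
// Plain strings stand in for perceptual image hashes.

let knownHashes: Set<String> = ["hashA", "hashB", "hashC"]  // illustrative known-image database
let matchThreshold = 30  // the rough threshold Apple described before anything is revealed

func flaggedForReview(deviceHashes: [String]) -> Bool {
    // Count how many of the device's photo hashes appear in the known set.
    let matches = deviceHashes.filter { knownHashes.contains($0) }.count
    // Only once the count reaches the threshold would the server know where to look;
    // below it, the flagged items stay opaque.
    return matches >= matchThreshold
}
```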

The problem, though, is that despite the detailed technical explanations and assurances, this concept of on-device screening did not land well. That private iPhone filtering was simply seen as spyware on the device, raising the specter of scope creep, of ever more content being flagged at the urging of U.S. and overseas governments. And so Apple has retreated to its drawing board to rethink.

But turn that around and there's an interesting riddle for the rest of the industry. Apple has highlighted the privacy invasion inherent in searching all your photos in the cloud; matching against known CSAM databases may be welcome, but does it stop there? And what about the risks buried in Apple's technical details, around false matches and manual reviews? Does that mean our cloud photos on other platforms are regularly flagged and landing on human reviewers' desks?

Worse, the real issue that holed Apple's CSAM plans below the waterline was the risk that governments would push for expansion beyond known CSAM content, curated by child safety organizations, to other content: political or religious dissent, other crimes, persecuted minorities in parts of the world where Apple sells its devices.

Apple explained in great detail that it had technical safeguards to make this difficult, promising it would always say no. It then said it would launch in the U.S. only and expand only to countries where those risks could be contained. But that didn't reassure the frenzied privacy lobby, especially given Apple's past difficulties in "just saying no" to China, for example, over iCloud storage locations and the censorship of apps.

Clearly, you don't need to be a technical genius to see that these same risks apply to cloud screening and are not limited to on-device software. Yes, the jurisdiction in which cloud data is stored varies, but big tech still has to comply with local laws, as is often made clear, and the "it isn't technically possible" defense used to protect, say, end-to-end encrypted messaging cannot be applied here.

And so to Google Photos. There are three reasons why Apple users should delete its apps. First, using Google Photos means giving the platform full access to your photos. It's all or nothing. Apple has a relatively new privacy-preserving tool to limit which photos any given app can access. But Google Photos won't accept it, insisting that you change the setting to give it access to everything if you want to use the app.
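
For context, the tool in question is the limited photo library access Apple introduced in iOS 14. A minimal sketch of how an app can adopt it through Apple's PhotoKit framework looks like this; how any individual app (including Google Photos) actually responds to a limited grant is up to that app.

```swift
import Photos

// Request photo library access and handle the "limited" case, where the
// user exposes only a hand-picked subset of photos to the app.
PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
    switch status {
    case .limited:
        print("Limited access: the app can see only the photos the user selected")
    case .authorized:
        print("Full library access granted")
    default:
        print("Access denied, restricted, or not yet determined")
    }
}
```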

Second, Google Photos' privacy label is a horror show compared to Apple's alternative. As with its other staple apps, Google (like Facebook) collects what it can, with the excuse that the data is only used when needed. But the problem is that Google links all of this data to your identity, adding it to the rich profiles associated with your Google account or other personal identifiers. Google doesn't do this as a favor; it is the core of its data-driven advertising business model. Just follow the money.

Google says these labels "show all possible data that could be collected, but the actual data depends on the specific features you choose to use… We'll collect contact information if you want to share your photos and videos… or if you decide to buy a photo book, we'll collect your payment information and store your purchase history. But that data won't be collected if you decide not to share photos or make a purchase."

Google, like Facebook, will also collect metadata from photos and feed that data into its algorithmic money machine. "We use EXIF location data to improve the user experience in the app," the company told me. "For example, to show a trip in our Memories feature or suggest a photo book from a recent trip."
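
That EXIF location data is ordinary metadata embedded in the image file itself, readable by anything with access to the original. As a rough illustration, assuming Apple's ImageIO framework and a placeholder file URL, extracting the GPS coordinates looks something like this (the sketch ignores the N/S and E/W reference tags that set the sign of each coordinate):

```swift
import Foundation
import ImageIO

// Read the GPS coordinates embedded in a photo's EXIF/GPS metadata.
func photoCoordinates(at url: URL) -> (latitude: Double, longitude: Double)? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
          let gps = properties[kCGImagePropertyGPSDictionary] as? [CFString: Any],
          let latitude = gps[kCGImagePropertyGPSLatitude] as? Double,
          let longitude = gps[kCGImagePropertyGPSLongitude] as? Double
    else { return nil }
    return (latitude, longitude)
}
```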

Clearly, everyone can take their own view on how much personal data they are happy to have ingested into Google's datasets for mining and analysis, and Google now offers more controls than ever before to restrict what is shared. But limiting Google's access also limits its functionality. It is this basic philosophy that is at stake.

"Your photo and video albums are full of precious moments," Apple says, in contrast to Google's approach. "Apple devices are designed to give you control over those memories." And at the core of that assurance we have the same device-versus-cloud debate that framed the CSAM controversy that engulfed Apple last month.

Which leads to the third issue. We already know that Google applies AI in the cloud to the photos it stores. Behind Apple's CSAM move was its well-established approach of analyzing your data on your device. Apple uses on-device ML to sort photos, for example, letting you intelligently search for objects or people. Google does this in the cloud. And while Apple's CSAM stumble came from linking that on-device ML to external processing, Google's cloud ML is already external, off the device, a relative black box for users.
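
The kind of on-device analysis Apple describes is available to any developer through its Vision framework. Here is a brief sketch of generic local image classification, not Apple Photos' internal pipeline, with an arbitrary confidence cut-off chosen for illustration:

```swift
import Foundation
import Vision

// Classify an image locally; nothing leaves the device.
func classify(imageURL: URL) throws -> [String] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(url: imageURL)
    try handler.perform([request])
    // Keep only reasonably confident labels; the 0.5 cut-off is arbitrary.
    return (request.results ?? [])
        .filter { $0.confidence > 0.5 }
        .map { $0.identifier }
}
```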

When Apple says its Photos platform is designed so that "face recognition and scene and object detection, which power features like For You, Memories, Sharing Suggestions, and the People album, happen on device instead of in the cloud… And when apps request access to your photos, you can share just the images you want, not your entire library," we know exactly who it has in mind.

On its CSAM approach in Google Photos, the company told me that "we work closely with the National Center for Missing and Exploited Children and other agencies around the world to combat this type of abuse."

But Google would not be drawn on my other questions: Google Photos' privacy protections, the limits and restrictions on its detection, its policy on government requests (foreign or domestic), whether it has been asked to extend the scope of its detection. It pointed me instead to its general content policies (not metadata, you will note) and its transparency report.

Google also did not comment on the other AI classifiers it applies to Google Photos, how that data is collected and used, or whether it intends to review anything in light of the backlash against Apple. There is no suggestion that Google is doing anything beyond the obvious, but that's the thing with the cloud; it really is just someone else's computer.

Just as when we exposed Facebook's harvesting of EXIF data without any transparency for users, the problem is having to dig through general terms and conditions to understand what this really means for you. And when the analysis is done off the device, it's entirely invisible to you unless the company decides to share it. That was rather Apple's point about CSAM.

Is there a risk here? Yes, of course; Apple has told you as much. We know that, in any case, Google's architecture is far less privacy-preserving than Apple's. You should therefore engage with its apps and platforms with your eyes wide open.

In the meantime, if you've spent upwards of $1,000 on your iPhone, my recommendation is to make use of the privacy measures it offers. And that means skipping Google Photos, however advanced its search features may be. As ever, convenience comes at a price; absent full transparency and controls, that price remains too heavy to pay.