Silhouette of a mobile user seen next to a screen projection of the Apple logo in this photo illustration taken on March 28, 2018.
Dado Ruvic | Reuters
Apple said Friday it would delay a controversial plan to scan users’ photo libraries for child exploitation images.
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material,” the company said in a statement. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Shares of Apple fell slightly on Friday morning.
Apple sparked immediate controversy after announcing its system for checking whether users’ devices contain illegal child sexual abuse material, or CSAM. Critics noted that the system, which can check images stored in an iCloud account against a database of known CSAM images, was at odds with Apple’s messaging about its customers’ privacy.
The system does not scan a user’s photos directly, but looks for known digital “fingerprints” that match the CSAM database. If the system detects enough matching images in a user’s account, the account is flagged to a human reviewer, who can confirm the images and pass the information to law enforcement if necessary.
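To make the fingerprint-matching idea concrete, here is a minimal sketch in Swift. It is illustrative only and does not reflect Apple’s actual implementation: Apple’s system uses its own perceptual hash (“NeuralHash”) and a cryptographic private set intersection protocol, whereas this sketch substitutes a plain SHA-256 hash and an in-memory set. The names fingerprint, knownFingerprints, reviewThreshold and shouldFlagForReview are hypothetical, and the threshold value is only an assumption based on Apple’s public description of roughly 30 matches.

import Foundation
import CryptoKit

// Hypothetical stand-in for a perceptual image fingerprint.
// Apple's real system uses NeuralHash; SHA-256 of the raw bytes is used
// here only to illustrate the matching flow.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Database of known CSAM fingerprints (empty placeholder; in practice the
// hashes would be supplied by child-safety organizations).
let knownFingerprints: Set<String> = []

// Illustrative threshold: the account is only surfaced for human review
// once enough matches accumulate (assumed here to be about 30).
let reviewThreshold = 30

// Count how many of a user's images match the known database and decide
// whether the account should be flagged for human review.
func shouldFlagForReview(userImages: [Data]) -> Bool {
    let matchCount = userImages
        .map { fingerprint(of: $0) }
        .filter { knownFingerprints.contains($0) }
        .count
    return matchCount >= reviewThreshold
}

The key design point this sketch preserves is that matching happens against fingerprints rather than image content, and that no single match triggers review; only an accumulation of matches past a threshold does.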
Apple’s CSAM detection system had been slated to go live for customers later this year. It’s unclear how long Apple will delay the launch following Friday’s announcement.
Despite the controversy over Apple’s move, scanning for CSAM is actually standard practice among tech companies. Facebook, Dropbox, Google and many others have systems that can automatically detect CSAM uploaded to their respective services.
This is breaking news. Please check back for updates.