San Francisco doctor accused of possessing child pornography on iCloud

Amid controversy over Apple’s CSAM detection system, a doctor in the San Francisco Bay Area has been accused of possessing child pornography on his Apple iCloud account, according to federal authorities.

The U.S. Department of Justice announced Thursday that Andrew Mollick, 58, had at least 2,000 sexually exploitative images and videos of children stored on his iCloud account. Mollick is an oncology specialist affiliated with several medical facilities in the Bay Area, as well as an associate professor at the UCSF School of Medicine.

He also posted one of the images on the Kik social media app, according to the recently unsealed federal complaint (via KRON4).

Apple recently announced plans to introduce a system designed to detect child sexual abuse material (CSAM) in iCloud and provide a report to the National Center for Missing and Exploited Children (NCMEC). The system, which relies on cryptographic techniques to ensure user privacy, has caused controversy among the digital rights and cybersecurity communities.

The system does not scan the actual images in a user’s iCloud account. Instead, it matches hashes of images stored in iCloud against known CSAM hashes provided by at least two child safety organizations. An account is only flagged once it crosses a threshold of at least 30 matches, which helps mitigate false positives.
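To make the threshold mechanism concrete, here is a deliberately simplified sketch in Python. It is an illustration only: the function names (`image_hash`, `should_flag`) are hypothetical, and Apple’s actual system uses NeuralHash perceptual hashing combined with cryptographic techniques such as private set intersection and threshold secret sharing, not the plain exact-match lookup shown here.

```python
import hashlib

# Hypothetical illustration of threshold-based hash matching.
# Apple's real system uses NeuralHash (a perceptual hash) plus
# cryptographic protocols; this sketch substitutes SHA-256 and a
# plain set lookup to show the flagging logic only.

MATCH_THRESHOLD = 30  # threshold reported for Apple's system


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; here just SHA-256 of the raw bytes."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(user_images: list[bytes], known_csam_hashes: set[str]) -> int:
    """Count how many of a user's images match the known-hash database."""
    return sum(1 for img in user_images if image_hash(img) in known_csam_hashes)


def should_flag(user_images: list[bytes], known_csam_hashes: set[str]) -> bool:
    """Flag an account only once the match count crosses the threshold,
    so isolated false positives alone never trigger a report."""
    return count_matches(user_images, known_csam_hashes) >= MATCH_THRESHOLD
```

The point of the threshold design is that a single matching image reveals nothing and triggers nothing; only a sizable collection of matches crosses the line into human review and a report to NCMEC.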

Documents revealed during the Epic Games vs. Apple lawsuit indicated that Eric Friedman, Apple’s anti-fraud chief, thought the Cupertino tech giant’s services were the “biggest platform to distribute” CSAM. Friedman attributed this to Apple’s firm stance on user privacy.

Despite the backlash, Apple is pressing ahead with its plans to launch the CSAM detection system. The company maintains that the platform will still preserve the privacy of users who do not have collections of CSAM in their iCloud accounts.
