According to cryptographers at Johns Hopkins University, iOS does not use its built-in encryption measures as much as it could, leaving potentially unnecessary security vulnerabilities.
Using publicly available documentation from Apple and Google, law enforcement reports on circumventing mobile security features, and their own analysis, the researchers assessed the robustness of iOS and Android encryption. They found that while the encryption infrastructure on iOS “sounds great,” it is largely left unused:
“On iOS in particular, there is the necessary infrastructure for this hierarchical encryption that sounds great,” said Maximilian Zinkus, lead researcher on iOS. “But I was definitely surprised to see then how much is not used.”
When an iPhone is started, all the stored data is in “Full Protection” and the user must unlock the device before anything can be decrypted. While this is extremely secure, the researchers noted that once the device is unlocked for the first time after reboot, a large amount of data goes into a state that Apple calls “Protected until first user authentication.”
Because devices are rarely restarted, most data on most iPhones sits in the “Protected until first user authentication” state rather than “Full Protection.” The advantage of this less secure state is that decryption keys are kept in quick-access memory, where applications can reach them quickly.
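The distinction between the two states can be illustrated with a toy model. This is a simplified sketch, not Apple's actual implementation: the class names, key sizes, and XOR-based "wrapping" are stand-ins for the real hierarchical key scheme, which entangles the passcode with hardware keys inside the Secure Enclave. The point it demonstrates is the one the researchers raise: after the first unlock, the "until first user authentication" class key stays resident in memory even while the device is locked.

```python
# Toy model of iOS-style Data Protection classes -- illustrative only.
# Names, key sizes, and the wrapping scheme are simplifications,
# not Apple's real implementation.
import hashlib
import hmac
import os

def derive_master_key(passcode: str, salt: bytes) -> bytes:
    # Real devices entangle the passcode with a hardware UID inside the
    # Secure Enclave; PBKDF2 stands in for that derivation here.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)

def wrap(master_key: bytes, class_key: bytes) -> bytes:
    # Stand-in for real AES key wrapping: XOR with an HMAC-derived keystream.
    stream = hmac.new(master_key, b"wrap", hashlib.sha256).digest()
    return bytes(a ^ b for a, b in zip(class_key, stream))

unwrap = wrap  # XOR "wrapping" is its own inverse

class Device:
    def __init__(self, passcode: str):
        self.salt = os.urandom(16)
        master = derive_master_key(passcode, self.salt)
        # Two protection classes, each with its own wrapped class key.
        self.wrapped = {
            "FullProtection": wrap(master, os.urandom(32)),
            "AFU": wrap(master, os.urandom(32)),  # "...until first user authentication"
        }
        self.in_memory = {}  # unwrapped keys an attacker could scrape

    def unlock(self, passcode: str):
        master = derive_master_key(passcode, self.salt)
        # After any unlock, both class keys are decrypted into memory...
        self.in_memory = {name: unwrap(master, w)
                          for name, w in self.wrapped.items()}

    def lock(self):
        # ...but on lock, only the FullProtection key is evicted;
        # the AFU key stays resident until the next reboot.
        self.in_memory.pop("FullProtection", None)

d = Device("1234")
d.unlock("1234")
d.lock()
print("FullProtection" in d.in_memory)  # False: evicted when the device locked
print("AFU" in d.in_memory)             # True: still recoverable from memory
```

In this model, a memory-scraping attacker who reaches a locked-but-once-unlocked device finds the AFU class key still present, which is exactly why data in that class is weaker than data under Full Protection.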
In theory, an attacker could find and exploit certain types of security vulnerabilities in iOS to obtain encryption keys in the quick access memory, allowing them to decrypt large amounts of data from the device. This is believed to be how many smartphone access tools work, such as those from forensic access company Grayshift.
While it is true that attackers need a specific operating system vulnerability to access the keys, and both Apple and Google patch many of these flaws as they are noticed, the researchers argue the exposure could be avoided by hiding the encryption keys more deeply.
“It really impacted me, because I got into this project thinking that these phones really protected users’ data well,” says Johns Hopkins cryptographer Matthew Green. “Now I’ve left the project thinking that almost nothing is protected as much as it could be. Why do we need a back door for law enforcement when the protections these phones offer are so bad?”
The researchers also directly shared their findings and various technical recommendations with Apple. An Apple spokesman offered a public statement in response:
“Apple devices are designed with multiple layers of security to protect against a wide range of potential threats, and we’re constantly working to add new protections to our users’ data. As customers continue to increase the amount of confidential information they store on their devices, we will continue to develop additional protections in both hardware and software to protect their data.”
The spokesman also told Wired that Apple’s security work focuses primarily on protecting users from hackers, thieves, and criminals who want to steal personal information. They also noted that the types of attacks the researchers highlighted are very expensive to develop, require physical access to the target device, and only work until Apple releases a patch. Apple also stressed that its goal with iOS is to balance security and convenience.